Neta Lumina [FP8 scaled]

Updated: Nov 22, 2025

Type: Checkpoint Trained
Format: SafeTensor
Published: Oct 10, 2025
Base Model: Lumina
Hash (AutoV2): 3B153BB109

This page contains fp8 scaled DiT models of Neta Lumina.

All credit belongs to the original model author. The license is the same as that of the original model.

Note: as of this writing (11/22/2025), ComfyUI does not support full fp8 mode for Lumina 2; all calculations still run in bf16. Therefore, using the fp8 checkpoint won't make the model run faster.
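
To make the memory/speed trade-off concrete, here is a minimal PyTorch sketch (assuming a recent PyTorch build with float8_e4m3fn support; this is not ComfyUI's actual loader code): the weights are stored in fp8, which halves their memory footprint, but they are upcast to bf16 right before the matmul, so the compute itself is unchanged.

```python
# Minimal sketch, not ComfyUI's actual code: fp8 storage + bf16 compute.
# Assumes a recent PyTorch build with torch.float8_e4m3fn support.
import torch

w_bf16 = torch.randn(4096, 4096, dtype=torch.bfloat16)  # original DiT weight
w_fp8 = w_bf16.to(torch.float8_e4m3fn)                   # how it sits in VRAM

# Storage cost per element: 2 bytes (bf16) vs 1 byte (fp8), i.e. ~50% less VRAM.
print(w_bf16.element_size(), "vs", w_fp8.element_size())

x = torch.randn(1, 4096, dtype=torch.bfloat16)

# Compute: the fp8 weight is upcast to bf16 before the matmul, so the GEMM
# still runs at bf16 precision and bf16 speed; there is no speedup from fp8.
y = x @ w_fp8.to(torch.bfloat16)
print(y.shape)
```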


About "scaled fp8":

  • ComfyUI supports it out of the box. You don't have to change anything; just load it as a normal model using the same loader node.

  • "scaled fp8" is not fp8. "scaled fp8" can give you identical quality comparing to the original model.

  • -50% VRAM usage.
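
For intuition, here is a rough numerical sketch of the scaled-fp8 idea (a simplified illustration, not the exact on-disk format of these files): each weight tensor is stored as fp8 together with one higher-precision scale factor, and rescaling on load reconstructs values very close to the originals, which is why quality stays nearly identical while per-weight storage halves.

```python
# Rough illustration of "scaled fp8" (per-tensor scale + fp8 weights).
# Simplified sketch under stated assumptions, not ComfyUI's exact format.
import torch

FP8_MAX = 448.0  # largest finite value representable in float8_e4m3fn

def quantize_scaled_fp8(w: torch.Tensor):
    # Pick a per-tensor scale so the largest weight lands near the fp8 range limit.
    scale = w.abs().max().float() / FP8_MAX
    w_fp8 = (w.float() / scale).clamp(-FP8_MAX, FP8_MAX).to(torch.float8_e4m3fn)
    return w_fp8, scale

def dequantize_scaled_fp8(w_fp8: torch.Tensor, scale: torch.Tensor, dtype=torch.bfloat16):
    # Multiply the stored fp8 values back by the scale before use.
    return (w_fp8.float() * scale).to(dtype)

w = torch.randn(4096, 4096, dtype=torch.bfloat16)
w_fp8, scale = quantize_scaled_fp8(w)
w_restored = dequantize_scaled_fp8(w_fp8, scale)

# The reconstruction error is small, which is why output quality stays close
# to the original model even though each weight now occupies half the bytes.
rel_err = (w_restored.float() - w.float()).abs().mean() / w.float().abs().mean()
print(f"mean relative error: {rel_err.item():.4f}")
```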