Z-Image [fp8]

Updated: Feb 8, 2026

Type: Checkpoint Trained
Format: SafeTensor (verified)
Published: Feb 6, 2026
Base Model: ZImageBase
Hash (AutoV2): 9E5082DDF9

An fp8-quantized Z-Image for ComfyUI, made with its "TensorCoreFP8Layout" quantization feature.

  • Scaled fp8 weights: higher precision than plain fp8.

  • Mixed precision: important layers remain in bf16.
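The "scaled fp8" idea can be sketched in plain Python. This is only an illustrative emulation of e4m3 rounding with a per-tensor scale, not ComfyUI's actual implementation; `round_to_e4m3`, `quantize_scaled`, and `dequantize` are hypothetical names:

```python
import math

E4M3_MAX = 448.0  # largest finite value representable in fp8 e4m3

def round_to_e4m3(x: float) -> float:
    """Emulate e4m3 rounding: 1 implicit + 3 explicit mantissa bits.

    Simplified: subnormals and the exponent-range limit are ignored.
    """
    if x == 0.0:
        return 0.0
    sign = math.copysign(1.0, x)
    m, e = math.frexp(min(abs(x), E4M3_MAX))  # x = m * 2**e, 0.5 <= m < 1
    m = round(m * 16) / 16                    # keep 4 significant bits
    return sign * math.ldexp(m, e)

def quantize_scaled(weights):
    """Per-tensor scaling: map the largest weight onto the full e4m3 range."""
    amax = max(abs(w) for w in weights)
    scale = amax / E4M3_MAX if amax > 0 else 1.0
    return [round_to_e4m3(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate weights by multiplying the stored scale back in."""
    return [v * scale for v in q]
```

The scale is stored alongside the fp8 tensor, so small weights are not pushed into the bottom of the fp8 range.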

There is no "official" fp8 version of Z-Image from ComfyUI, so I made my own.

All credit belongs to the original model author. The license is the same as the original model's.

FYI: many people assume an fp8 model means a big quality loss. That's because an fp8 model saved by ComfyUI is not actually quantized (it's a plain cast down to fp8), and many creators made their fp8 models that way.

If you see creators complaining about the poor quality of fp8 models saved by ComfyUI, send them this link, or make your own quantized fp8 model from the bf16 weights:
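To see why an unscaled cast loses quality, here is a toy comparison under a simplified e4m3 model (assumptions: 3 mantissa bits, max value 448, flush-to-zero below the smallest subnormal 2**-9; real e4m3 subnormal behavior is glossed over):

```python
import math

E4M3_MAX = 448.0
E4M3_MIN_SUBNORMAL = 2.0 ** -9  # smallest positive e4m3 value

def to_e4m3(x: float) -> float:
    """Round to a simplified e4m3: clamp to max, flush tiny values to zero."""
    if abs(x) < E4M3_MIN_SUBNORMAL / 2:
        return 0.0
    sign = math.copysign(1.0, x)
    m, e = math.frexp(min(abs(x), E4M3_MAX))
    m = round(m * 16) / 16  # 4 significant bits
    return sign * math.ldexp(m, e)

weights = [0.0004, -0.003, 0.05, 0.9]

# Plain cast (what an unquantized fp8 save amounts to): the tiny weight underflows.
naive = [to_e4m3(w) for w in weights]

# Scaled cast: rescale so the largest weight uses the full fp8 range,
# store the scale, and multiply it back on load.
scale = max(abs(w) for w in weights) / E4M3_MAX
scaled = [to_e4m3(w / scale) * scale for w in weights]
```

With these example values, the plain cast flushes 0.0004 to zero, while the scaled version keeps every weight within a few percent of its original value.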

https://github.com/silveroxides/ComfyUI-QuantOps

I'm just sharing the tool; I don't use it myself. I use my own old script.


Base

Quantized Z-Image, a.k.a. the "base" version of Z-Image.

https://huggingface.co/Tongyi-MAI/Z-Image

Note: no hardware fp8 is used; all calculations still run in bf16. This is intentional.

Rev 1.1: an updated version with better "mixed precision". More layers stay in bf16, so the file is bigger. The previous version will be deleted.


Turbo

Quantized Z-Image-Turbo.

https://huggingface.co/Tongyi-MAI/Z-Image-Turbo

Rev 1.1: an updated version with better "mixed precision". More layers stay in bf16, so the file is bigger. No hardware fp8. The previous version will be deleted.

v1: contains calibrated metadata for hardware fp8 linear layers. If your GPU supports it, ComfyUI will use hardware fp8 automatically, which should be a little faster. For more about hardware fp8 and its hardware requirements, see ComfyUI's TensorCoreFP8Layout.


Qwen3 4b

Quantized Qwen3 4B: scaled fp8 + mixed precision. The early layers (embed_tokens, layers.[0-1]) and final layers (layers.[34-35]) remain in bf16.

https://huggingface.co/Qwen/Qwen3-4B
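The layer split above can be sketched as a simple filter over the checkpoint's parameter names (illustrative only; the names follow Qwen3's Hugging Face layout, and `target_dtype` is a hypothetical helper):

```python
import re

# Assumed mixed-precision rule from the description above: embed_tokens plus
# the first two (0-1) and last two (34-35) transformer blocks stay in bf16.
KEEP_BF16 = re.compile(r"embed_tokens|layers\.(?:0|1|34|35)\.")

def target_dtype(param_name: str) -> str:
    """Decide which dtype a parameter should be stored in."""
    return "bf16" if KEEP_BF16.search(param_name) else "fp8_e4m3_scaled"
```

Everything in between (layers 2-33) would be quantized to scaled fp8.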