
(flux1 bnb nf4) SimpleworkFlow

Updated: Sep 5, 2024
Type: Workflows
Published: Sep 5, 2024
Base Model: Flux.1 D
Hash (AutoV2): AA6FC75626
The FLUX.1 [dev] Model is licensed by Black Forest Labs Inc. under the FLUX.1 [dev] Non-Commercial License. Copyright Black Forest Labs Inc.
IN NO EVENT SHALL BLACK FOREST LABS, INC. BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH USE OF THIS MODEL.

NF4 is significantly faster and more memory-efficient than FP8 because it uses the native bnb.matmul_4bit kernel, which avoids casting and leverages low-bit CUDA tricks. It also achieves better numerical precision and dynamic range by storing weights in multiple tensors of varying precisions, unlike FP8's single-tensor approach.
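As a rough illustration (not part of the workflow itself), here is a minimal sketch of what NF4 storage looks like with bitsandbytes; the shapes are hypothetical and a CUDA GPU with a recent bitsandbytes build is assumed:

```python
import torch
import bitsandbytes.functional as bnbF

# Stand-in for one fp16 weight matrix from the model (shape is made up).
w = torch.randn(4096, 4096, dtype=torch.float16, device="cuda")

# Quantize to NF4: values are packed two per byte, and per-block absmax
# statistics are kept in separate higher-precision tensors (the
# "multiple tensors of varying precisions" mentioned above).
w_nf4, quant_state = bnbF.quantize_4bit(w, quant_type="nf4")
print(w_nf4.dtype, w_nf4.numel())   # torch.uint8, ~half the element count of w

# bnb.matmul_4bit consumes (w_nf4, quant_state) directly at inference time,
# so the full fp16 weight never has to be rebuilt in memory.
# For illustration, dequantize and check the round-trip error instead:
w_back = bnbF.dequantize_4bit(w_nf4, quant_state)
print((w - w_back).abs().mean().item())
```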

The list of all NF4 models (LoRA is not supported yet):

https://huggingface.co/silveroxides/flux1-nf4-weights/tree/main
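If you want to confirm that a downloaded file really is an NF4 checkpoint, a quick (hypothetical) check is to list its tensors and look for the packed uint8 weights and quantization-state entries; the file name and location below are assumptions, so use whichever file you actually downloaded:

```python
from safetensors import safe_open

# Assumed path; adjust to your own download location.
path = "ComfyUI/models/checkpoints/flux1-dev-bnb-nf4.safetensors"

with safe_open(path, framework="pt", device="cpu") as f:
    for key in list(f.keys())[:20]:        # print only the first few entries
        t = f.get_tensor(key)
        print(key, tuple(t.shape), t.dtype)
```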

All you have to do is the following:

Go to ComfyUI/custom_nodes/

Then, for the full checkpoint:

https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4

  • git clone https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4

Or, for the UNet only:

  • git clone https://github.com/DenkingOfficial/ComfyUI_UNet_bitsandbytes_NF4.git
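Both custom nodes need bitsandbytes and a CUDA build of PyTorch in ComfyUI's Python environment. An optional sanity check (assumed to be run with ComfyUI's own interpreter) before launching ComfyUI:

```python
import torch
import bitsandbytes as bnb

# If either import fails or CUDA is unavailable, the NF4 nodes won't load.
print("torch:", torch.__version__, "CUDA available:", torch.cuda.is_available())
print("bitsandbytes:", bnb.__version__)
```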

Then run this workflow in ComfyUI.

"more advance" workflow "just disable checkpoint and activate NF4 checkpoint loader"

For Flux resolutions, the workflow uses the custom node below; you can also bypass or remove it:

https://github.com/al-swaiti/ComfyUI-OllamaGemini