
Flux1 Dev Consistent Character - Fast Generation (PuLID, Controlnet, Turbo Alpha Lora, Torch Compile)

Updated: Oct 26, 2024
Tags: tool, flux1.d
Type: Workflows
Published: Oct 26, 2024
Base Model: Flux.1 D
Hash (AutoV2): CF1CAD4F48
The FLUX.1 [dev] Model is licensed by Black Forest Labs, Inc. under the FLUX.1 [dev] Non-Commercial License. Copyright Black Forest Labs, Inc.
IN NO EVENT SHALL BLACK FOREST LABS, INC. BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH USE OF THIS MODEL.

This workflow generates a 1024 x 1024 Flux image in about 16 s on a 4060 Ti. You will need 16 GB of VRAM for this workflow. It uses the fp8 Flux models because the GGUF models do not work well with torch.compile. If you are getting errors, please set up the environment as per the instructions below.

  • PuLID for character consistency

  • FLUX.1-dev-ControlNet-Union-Pro ControlNet (you will need to use the Q2_K Flux GGUF for the ControlNet on 16 GB VRAM)

  • The Turbo-Alpha LoRA by alimama-creative is used to reduce the number of steps required to 8.

  • The TorchCompileModel (beta) node is used to compile the model; this increases the time required for the first generation but speeds up subsequent generations (see the sketch after this list).

  • weight_dtype is set to fp8_e4m3fn_fast to take advantage of fp8 matrix multiplication for faster generation on NVIDIA 40-series graphics cards. If you are on an older card, select fp8_e4m3fn.
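
To make the compile-then-reuse behaviour above concrete, here is a minimal, self-contained Python sketch using a toy model. It is not part of the workflow; the workflow compiles Flux through the TorchCompileModel node, but the first-call/later-call pattern is the same:

    import torch

    # Toy stand-in for the diffusion model; only illustrates the pay-once,
    # reuse-afterwards behaviour of torch.compile.
    model = torch.nn.Sequential(
        torch.nn.Linear(64, 64),
        torch.nn.GELU(),
        torch.nn.Linear(64, 64),
    ).cuda().half()

    compiled = torch.compile(model)  # compilation is deferred until the first call
    x = torch.randn(8, 64, device="cuda", dtype=torch.float16)

    _ = compiled(x)  # first call: triggers compilation, noticeably slow
    _ = compiled(x)  # later calls: reuse the compiled kernels, fast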

Environment

I have tested this workflow using the exact environment below, so if you are having trouble, try setting up the environment using the following steps (a quick verification sketch follows the list):

  1. Download and install Python 3.10.11 (https://www.python.org/downloads/)

  2. Download and install CUDA 12.4 (https://developer.nvidia.com/cuda-12-4-0-download-archive) and the cuDNN tarball for CUDA Version 12 (https://developer.nvidia.com/cudnn-downloads). cuDNN is required for nodes like ReActor that use ONNX, which is why I have it installed anyway.

  3. Clone the latest ComfyUI (v0.2.4 as of publishing this workflow) using: git clone https://github.com/comfyanonymous/ComfyUI.git

  4. Change directory into the ComfyUI folder

  5. Create a virtual environment: python -m venv .venv

  6. Activate it: .venv\Scripts\activate.bat

  7. Install PyTorch for CUDA 12.4: pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu124

  8. Install the ComfyUI requirements: pip install -r requirements.txt

  9. Download Triton 3.1.0 for Python 3.10 (triton-3.1.0-cp310-cp310-win_amd64.whl) from https://github.com/woct0rdho/triton-windows/releases into your ComfyUI folder and install it using: pip install triton-3.1.0-cp310-cp310-win_amd64.whl

  10. Run ComfyUI using: python.exe -s .\main.py --windows-standalone-build
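
If the steps above succeeded, a quick sanity check from inside the activated .venv should confirm a cu124 PyTorch build, a visible GPU, cuDNN, and Triton 3.1.0. This is only a verification sketch, not part of the workflow:

    import torch

    print(torch.__version__)                   # should report a +cu124 build
    print(torch.cuda.is_available())           # True if the GPU and CUDA runtime are visible
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))   # e.g. the 4060 Ti used for testing
    print(torch.backends.cudnn.version())      # non-None once cuDNN is installed

    import triton
    print(triton.__version__)                  # 3.1.0 if the wheel installed correctly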

Models

  1. flux1-dev-fp8-e4m3fn.safetensors (models/diffusion_models/flux): https://huggingface.co/Kijai/flux-fp8/tree/main

  2. t5xxl_fp8_e4m3fn_scaled.safetensors (models/clip): https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main

  3. ViT-L-14-BEST-smooth-GmP-TE-only-HF-format.safetensors (models/clip): https://huggingface.co/zer0int/CLIP-GmP-ViT-L-14/tree/main

  4. pulid_flux_v0.9.0.safetensors (models/pulid): https://huggingface.co/guozinan/PuLID/blob/main/pulid_flux_v0.9.0.safetensors

  5. antelopeV2 (models/insightface/models/antelopev2): download antelopev2.zip from https://huggingface.co/MonsterMMORPG/tools/tree/main and unzip. You should have all the antelopeV2 model files in the models/insightface/models/antelopev2 folder.

  6. FLUX.1-dev-ControlNet-Union-Pro (models/controlnet): https://huggingface.co/Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro/tree/main download diffusion_pytorch_model.safetensors and rename to FLUX.1-dev-ControlNet-Union-Pro.safetensors

  7. FLUX.1-Turbo-Alpha (models/loras/flux): https://huggingface.co/alimama-creative/FLUX.1-Turbo-Alpha download diffusion_pytorch_model.safetensors and rename to FLUX.1-Turbo-Alpha.safetensors

  8. Flux VAE: install with ComfyUI Manager
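
If you would rather script the single-file downloads above, the sketch below uses huggingface_hub to fetch them and copy them into the expected ComfyUI folders under the names the workflow expects. Repo IDs, filenames, and target folders are taken from the list above (verify them against each repo); the ComfyUI path is an assumption, and the antelopeV2 zip and the Flux VAE still need to be handled as described in items 5 and 8.

    import shutil
    from pathlib import Path
    from huggingface_hub import hf_hub_download

    COMFY = Path("ComfyUI/models")  # assumption: adjust to your ComfyUI install

    downloads = [
        # (repo_id, filename in the repo, target path/name inside ComfyUI)
        ("Kijai/flux-fp8", "flux1-dev-fp8-e4m3fn.safetensors",
         COMFY / "diffusion_models/flux/flux1-dev-fp8-e4m3fn.safetensors"),
        ("comfyanonymous/flux_text_encoders", "t5xxl_fp8_e4m3fn_scaled.safetensors",
         COMFY / "clip/t5xxl_fp8_e4m3fn_scaled.safetensors"),
        ("zer0int/CLIP-GmP-ViT-L-14", "ViT-L-14-BEST-smooth-GmP-TE-only-HF-format.safetensors",
         COMFY / "clip/ViT-L-14-BEST-smooth-GmP-TE-only-HF-format.safetensors"),
        ("guozinan/PuLID", "pulid_flux_v0.9.0.safetensors",
         COMFY / "pulid/pulid_flux_v0.9.0.safetensors"),
        ("Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro", "diffusion_pytorch_model.safetensors",
         COMFY / "controlnet/FLUX.1-dev-ControlNet-Union-Pro.safetensors"),
        ("alimama-creative/FLUX.1-Turbo-Alpha", "diffusion_pytorch_model.safetensors",
         COMFY / "loras/flux/FLUX.1-Turbo-Alpha.safetensors"),
    ]

    for repo_id, filename, target in downloads:
        target.parent.mkdir(parents=True, exist_ok=True)
        cached = hf_hub_download(repo_id=repo_id, filename=filename)  # downloads into the HF cache
        shutil.copy(cached, target)                                   # copy and rename into ComfyUI
        print(f"{repo_id}/{filename} -> {target}")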

Custom Nodes

  1. ComfyUI-GGUF: only needed for 16 GB VRAM users who want to use the ControlNet

  2. rgthree's ComfyUI Nodes

  3. KJNodes for ComfyUI

  4. ComfyUI-PuLID-Flux-Enhanced
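
All four node packs can be installed by name through ComfyUI Manager. If you prefer git, the sketch below clones them into custom_nodes; the GitHub owners are my best guess for each pack (the ComfyUI-PuLID-Flux-Enhanced owner in particular is an assumption), so verify the URLs or just use the Manager. If a pack ships its own requirements.txt, install it into the same .venv.

    import subprocess
    from pathlib import Path

    custom_nodes = Path("ComfyUI/custom_nodes")  # assumption: adjust to your install

    repos = [
        "https://github.com/city96/ComfyUI-GGUF.git",                   # only for the 16 GB VRAM ControlNet route
        "https://github.com/rgthree/rgthree-comfy.git",                 # rgthree's ComfyUI Nodes
        "https://github.com/kijai/ComfyUI-KJNodes.git",                 # KJNodes for ComfyUI
        "https://github.com/sipie800/ComfyUI-PuLID-Flux-Enhanced.git",  # owner is an assumption -- verify
    ]

    for url in repos:
        subprocess.run(["git", "clone", url], cwd=custom_nodes, check=True)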