It works: Flux.1 dev in 6.5 GB of RAM.
Flux.1 dev on an M2 Mac with 24 GB of RAM; ComfyUI used about 6 GB.
Get a GGUF model, for example:
https://civitai.com/models/647237?modelVersionId=725532
I am using the Q4_1 quant. ComfyUI uses about 6 GB of RAM, down from 40+ GB, and each image takes about 40 minutes to generate.
Put the model file in the models/unet folder.
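For example, assuming a standard ComfyUI checkout in your home directory and a Q4_1 file name (adjust both to whatever you actually downloaded):
mv ~/Downloads/flux1-dev-Q4_1.gguf ~/ComfyUI/models/unet/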
I used ComfyUI Manager to install several custom node packs matching the "gguf" keyword; one of them must have worked.
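If you would rather install it directly, I believe the loader nodes come from the ComfyUI-GGUF custom node pack, which can also be set up by hand (repo and package name assumed; the Manager route above is what I actually did):
cd ~/ComfyUI/custom_nodes
git clone https://github.com/city96/ComfyUI-GGUF
python -m pip install --upgrade gguf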
Use the GGUF loader node to load the model: Add Node > gguf > GGUF loader.
Also use the GGUF dual CLIP loader for the text encoders.
I read somewhere that the fp8 models need an M3 or later; the fp16 T5 encoder (t5xxl_fp16.safetensors) worked for me.
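Assuming the same layout as above, the text encoder files go in the clip models folder:
mv ~/Downloads/t5xxl_fp16.safetensors ~/ComfyUI/models/clip/
With those in place, the GGUF loaders just slot into a normal Flux workflow in place of the standard UNet and dual CLIP loader nodes.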
Also: to get ComfyUI to work, after running the installer, cd to the ComfyUI directory and run:
python -m pip install -r requirements.txt
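Then start it the usual way from that same directory; the UI is served at http://127.0.0.1:8188 by default:
python main.py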