Cascade Native on ComfyUI and A1111 extension +Ollama and Proteus

Update ComfyUI and download the models below from the Manager (just search for "Cascade"). I list the VRAM size of each pipeline at the end of the article.
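A rough sketch of the update and model download, assuming a default ComfyUI install location and the `huggingface-cli` tool; the Manager can do the download for you, this is only the manual route. Folder placement (Stage B/C in `models/unet`, Stage A in `models/vae`, text encoder in `models/clip`) follows ComfyUI's Cascade examples, so double-check against your install:

```shell
# Update ComfyUI itself (path is an assumption; adjust to your setup)
cd ~/ComfyUI
git pull

# Manual download of the bf16 pipeline from Hugging Face
huggingface-cli download stabilityai/stable-cascade stage_c_bf16.safetensors --local-dir models/unet
huggingface-cli download stabilityai/stable-cascade stage_b_bf16.safetensors --local-dir models/unet
huggingface-cli download stabilityai/stable-cascade stage_a.safetensors --local-dir models/vae
```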

My extension
https://github.com/if-ai/IF_prompt_MKR

Cascade extension for A1111 / Forge
https://github.com/blue-pen5805/sdweb-easy-stablecascade-diffusers
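If you prefer installing the A1111/Forge extension manually instead of through the Extensions tab, a minimal sketch (the webui path is an assumption):

```shell
# Clone the Cascade extension into the webui's extensions folder,
# then restart the webui so it gets picked up.
cd ~/stable-diffusion-webui/extensions
git clone https://github.com/blue-pen5805/sdweb-easy-stablecascade-diffusers
```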

Ollama
https://ollama.com/blog/windows-preview
https://github.com/ollama/ollama

Ollama WEBUI
https://github.com/ollama-webui/ollama-webui

Ollama models
https://ollama.com/impactframes/mistral_alpha_xs
https://ollama.com/impactframes/stable_diffusion_prompt_maker
https://ollama.com/brxce/stable-diffusion-prompt-generator
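Once Ollama is installed, pulling and using one of these prompt-maker models looks like this (the one-shot `ollama run <model> "<prompt>"` form returns a generated SD prompt and exits):

```shell
# Download one of the prompt-maker models listed above
ollama pull impactframes/stable_diffusion_prompt_maker

# Ask it to expand a short idea into a full Stable Diffusion prompt
ollama run impactframes/stable_diffusion_prompt_maker "a cozy cabin in the woods"
```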

Stable Cascade
https://huggingface.co/stabilityai/stable-cascade

ProteusV0.3
https://huggingface.co/dataautogpt3/ProteusV0.3

ComfyUI workflow
https://gist.github.com/comfyanonymous/0f09119a342d0dd825bb2d99d19b781c#file-stable_cascade_workflow_test-json

Discord Banodoco
https://discord.gg/7UyNtVXB

Approximate VRAM requirements for ComfyUI

full
stage_a.safetensors, stage_b.safetensors, stage_c.safetensors and text encoder model.safetensors = 20GB+

bf16
stage_a.safetensors, stage_b_bf16.safetensors, stage_c_bf16.safetensors and text encoder model.safetensors = 14GB

lite
stage_a.safetensors, stage_b_lite.safetensors, stage_c_lite.safetensors and text encoder model.safetensors = 8GB

lite_bf16
stage_a.safetensors, stage_b_lite_bf16.safetensors, stage_c_lite_bf16.safetensors, and text encoder model.safetensors = 6GB
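The four tiers above trade quality for memory. As an illustration only (not part of any tool), a tiny helper that picks the largest pipeline fitting a given VRAM budget, using the rough totals from the table:

```python
# Approximate VRAM totals (GB) for each Cascade pipeline, from the table above.
PIPELINES = {
    "full": 20,       # stage_a + stage_b + stage_c + text encoder
    "bf16": 14,       # bf16 variants of stage_b and stage_c
    "lite": 8,        # lite variants of stage_b and stage_c
    "lite_bf16": 6,   # lite bf16 variants, the smallest option
}

def pick_pipeline(vram_gb: float):
    """Return the largest pipeline tier that fits in vram_gb, or None."""
    for name, needed in sorted(PIPELINES.items(), key=lambda kv: -kv[1]):
        if vram_gb >= needed:
            return name
    return None

print(pick_pipeline(16))  # bf16
print(pick_pipeline(6))   # lite_bf16
```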

SD A1111 and Forge extension
16GB VRAM
