Multi-GPU in ComfyUI (local, remote and cloud GPUs)

Published: Aug 7, 2025
Base Model: Wan Video 14B i2v 720p
Hash (AutoV2): 95889DB08E

ComfyUI Distributed Extension

I've been working on this extension to solve a problem that's frustrated me for months: having multiple GPUs but only being able to use one at a time in ComfyUI. The goal was to fix that while keeping everything user-friendly.


What it does:

  • Local workers: Use multiple GPUs in the same machine

  • Remote workers: Harness GPU power from other computers on your network

  • Cloud workers: Use GPUs hosted on a cloud service like Runpod, accessible via secure tunnels

  • Parallel processing: Generate multiple variations simultaneously

  • Distributed upscaling: Split large upscale jobs across multiple GPUs


Using Cloud Workers?

Join Runpod with this link and unlock a special bonus: https://get.runpod.io/0bw29uf3ug0p


Real-world performance:

  • Ultimate SD Upscaler with 4 GPUs: before 23s → after 7s

Easily convert any workflow:

  1. Add Distributed Seed node → connect to sampler

  2. Add Distributed Collector → after VAE decode

  3. Enable workers in the panel

  4. Watch all your GPUs finally work together! (A minimal sketch of a converted graph follows below.)
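
For illustration, here's a minimal sketch of what a converted text-to-image graph could look like when queued through ComfyUI's standard /prompt HTTP API. The DistributedSeed and DistributedCollector class names, their input names, and the checkpoint filename are assumptions for this sketch; check the exact node names the extension registers in your install.

```python
# Minimal sketch: a converted text-to-image graph queued via ComfyUI's /prompt API.
# Assumed names (not confirmed from the extension's source): "DistributedSeed",
# "DistributedCollector", and the checkpoint filename.
import json
from urllib import request

workflow = {
    # Standard loader and conditioning nodes
    "1": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "sd_xl_base_1.0.safetensors"}},
    "2": {"class_type": "CLIPTextEncode",
          "inputs": {"clip": ["1", 1], "text": "a mountain lake at sunrise"}},
    "3": {"class_type": "CLIPTextEncode",
          "inputs": {"clip": ["1", 1], "text": "blurry, low quality"}},
    "4": {"class_type": "EmptyLatentImage",
          "inputs": {"width": 1024, "height": 1024, "batch_size": 1}},

    # Step 1: Distributed Seed node feeds the sampler's seed input
    "5": {"class_type": "DistributedSeed", "inputs": {"seed": 123456}},
    "6": {"class_type": "KSampler",
          "inputs": {"model": ["1", 0], "positive": ["2", 0], "negative": ["3", 0],
                     "latent_image": ["4", 0], "seed": ["5", 0],
                     "steps": 25, "cfg": 7.0, "sampler_name": "euler",
                     "scheduler": "normal", "denoise": 1.0}},
    "7": {"class_type": "VAEDecode",
          "inputs": {"samples": ["6", 0], "vae": ["1", 2]}},

    # Step 2: Distributed Collector gathers results from all enabled workers
    "8": {"class_type": "DistributedCollector", "inputs": {"images": ["7", 0]}},
    "9": {"class_type": "SaveImage",
          "inputs": {"images": ["8", 0], "filename_prefix": "distributed"}},
}

# Step 3 happens in the UI: enable workers in the Distributed panel, then queue.
req = request.Request(
    "http://127.0.0.1:8188/prompt",
    data=json.dumps({"prompt": workflow}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
print(request.urlopen(req).read().decode())
```

The key points match the steps above: the seed node drives the sampler so each worker generates a different variation, and the collector sits after VAE decode so images from every enabled worker come back to the master.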

Upscaling:

  • Just replace the Ultimate SD Upscaler node with the Ultimate SD Upscaler Distributed node (see the sketch below).
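
In the exported API-format JSON, that swap amounts to changing the node's class_type; the distributed class name below is an assumption for illustration, so check the name the extension actually registers:

```python
# Stock Ultimate SD Upscale node (only a few inputs shown; the rest stay the same).
node_before = {
    "class_type": "UltimateSDUpscale",
    "inputs": {"image": ["7", 0], "model": ["1", 0], "upscale_by": 2.0},
}

# Distributed variant: identical inputs, different class_type
# ("UltimateSDUpscaleDistributed" is an assumed name). The upscale job is
# then split across all enabled workers instead of running on one GPU.
node_after = dict(node_before, class_type="UltimateSDUpscaleDistributed")
```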

I've been using it across 2 machines (7 GPUs total) and it's been rock solid.


GitHub: https://github.com/robertvoy/ComfyUI-Distributed


📚 Resources:

🔗 WAN 2.2 Workflow

🔗 GitHub

📺 Watch "Deploy Cloud Worker on RunPod" Tutorial

📺 Watch the most recent update video


Happy to answer questions about setup or share more technical details!