Multi-GPU in ComfyUI (local, remote and cloud GPUs)
ComfyUI Distributed Extension
I've been working on this extension to solve a problem that's frustrated me for months: having multiple GPUs but only being able to use one at a time in ComfyUI. And it had to be user-friendly.
What it does:
Local workers: Use multiple GPUs in the same machine
Remote workers: Harness GPU power from other computers on your network
Cloud workers: GPUs hosted on a cloud service like RunPod, accessible via secure tunnels
Parallel processing: Generate multiple variations simultaneously
Distributed upscaling: Split large upscale jobs across multiple GPUs
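Conceptually, a local worker is just another ComfyUI process pinned to a different GPU and port. The sketch below is not the extension's actual launcher, just an illustration of the idea; `--port` and `--cuda-device` are standard ComfyUI CLI flags, while the helper name and defaults are mine.

```python
# Illustrative only: one ComfyUI worker process per GPU.
# The extension manages workers for you; this shows the underlying idea.
# --port and --cuda-device are standard ComfyUI command-line flags.

def build_worker_commands(num_gpus: int, base_port: int = 8188) -> list[list[str]]:
    """Build one launch command per GPU: GPU 0 on base_port, GPU 1 on base_port+1, ..."""
    return [
        ["python", "main.py", "--port", str(base_port + gpu), "--cuda-device", str(gpu)]
        for gpu in range(num_gpus)
    ]

if __name__ == "__main__":
    for cmd in build_worker_commands(num_gpus=4):
        print(" ".join(cmd))
```

Each process only sees its assigned GPU, so the instances can render in parallel without stepping on each other's VRAM.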
Using Cloud Workers?
Join RunPod with this link and unlock a special bonus: https://get.runpod.io/0bw29uf3ug0p
Real-world performance:
- Ultimate SD Upscaler with 4 GPUs: 23s before → 7s after (roughly 3.3× faster)
Easily convert any workflow:
Add a Distributed Seed node → connect to your sampler
Add a Distributed Collector node → connect after VAE decode
Enable workers in the panel
Watch all your GPUs finally work together!
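As a rough mental model of the parallel-variation step: each enabled worker runs the same workflow with a different seed, so N workers give you N variations per run. The helper below is hypothetical, not the Distributed Seed node's actual code.

```python
# Hypothetical sketch of per-worker seed assignment for parallel variations.
# The real Distributed Seed node's logic may differ.

def worker_seeds(base_seed: int, num_workers: int) -> list[int]:
    """Master keeps base_seed; each worker gets a unique offset so outputs differ."""
    return [base_seed + i for i in range(num_workers)]

# One distinct seed per participating GPU, collected after VAE decode.
print(worker_seeds(42, 4))
```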
Upscaling
- Just replace the Ultimate SD Upscaler node with the Ultimate SD Upscaler Distributed node.
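Tiled upscaling parallelizes naturally because each tile can be refined independently, which is why four GPUs can cut a 23s job to about 7s. A simple round-robin split like the sketch below captures the idea; the helper is illustrative, not the distributed node's real scheduler.

```python
# Illustrative round-robin assignment of upscale tiles to workers.
# Integer tile indices stand in for (x, y) tile regions of the image.

def assign_tiles(num_tiles: int, num_workers: int) -> dict[int, list[int]]:
    """Map each worker id to the list of tile indices it will upscale."""
    assignment: dict[int, list[int]] = {w: [] for w in range(num_workers)}
    for tile in range(num_tiles):
        assignment[tile % num_workers].append(tile)
    return assignment

# e.g. 16 tiles over 4 GPUs → 4 tiles per GPU, processed concurrently
print(assign_tiles(16, 4))
```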
I've been using it across 2 machines (7 GPUs total) and it's been rock solid.
GitHub: https://github.com/robertvoy/ComfyUI-Distributed
📚 Resources:
📺 Watch "Deploy Cloud Worker on RunPod" Tutorial
📺 Watch the most recent update video
Happy to answer questions about setup or share more technical details!
