
Docker images for ComfyUI

This repo provides Docker images that run ComfyUI, a Stable Diffusion GUI with a powerful node-based workflow.

Quick Start - NVIDIA GPU

The supported CUDA versions for each GPU architecture are shown in the table below:

| GPU Architecture | Blackwell | Hopper | Ada Lovelace | Ampere | Turing | Volta | Pascal | Maxwell |
|------------------|-----------|--------|--------------|--------|--------|-------|--------|---------|
| Example GPU      | RTX 5090  | H20    | RTX 4090     | RTX 3090 | RTX 2080 | TITAN V | GTX 1080 | GTX 980 |
| cu128            | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |    |    |    |
| cu126            |    | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| cu118            |    |    | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |

"Supported" in the table refers to the GPU/CUDA combination officially supported by PyTorch, which may differ from NVIDIA’s official CUDA support. For details, see the PyTorch related code and this article .

CUDA 12.6: From Maxwell (GTX 9xx) to Ada Lovelace (RTX 40xx)

mkdir -p storage

docker run -it --rm \
  --name comfyui-cu126 \
  --gpus all \
  -p 8188:8188 \
  -v "$(pwd)"/storage:/root \
  -e CLI_ARGS="" \
  yanwk/comfyui-boot:cu126-slim
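
CLI_ARGS is passed to ComfyUI on startup, so any ComfyUI command-line flag can be set there. A minimal sketch for a low-VRAM card, reusing the command above with ComfyUI's standard --lowvram flag:

docker run -it --rm \
  --name comfyui-cu126 \
  --gpus all \
  -p 8188:8188 \
  -v "$(pwd)"/storage:/root \
  -e CLI_ARGS="--lowvram" \
  yanwk/comfyui-boot:cu126-slim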

CUDA 12.8: From Turing (RTX 20xx / GTX 16xx) to Blackwell (RTX 50xx)

mkdir -p storage

docker run -it --rm \
  --name comfyui-cu128 \
  --gpus all \
  -p 8188:8188 \
  -v "$(pwd)"/storage:/root \
  -e CLI_ARGS="--fast" \
  yanwk/comfyui-boot:cu128-slim
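
The commands above run in the foreground and remove the container on exit (--rm). For a longer-lived setup, the same container can be started detached and managed with standard Docker commands; a minimal sketch reusing the cu128 example:

docker run -d \
  --name comfyui-cu128 \
  --gpus all \
  -p 8188:8188 \
  -v "$(pwd)"/storage:/root \
  -e CLI_ARGS="--fast" \
  yanwk/comfyui-boot:cu128-slim

# Follow startup logs, then open http://localhost:8188 once ComfyUI is listening
docker logs -f comfyui-cu128

# Stop and remove the container; files under ./storage stay on the host
docker stop comfyui-cu128 && docker rm comfyui-cu128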

More image tags

Note: This repository is currently under heavy development. Going forward, the two mainly supported image families will be cu126 and cu129.

cu121 and cu124 images will be archived soon, as they are no longer supported by PyTorch.

cu118 will stay for a while for compatibility usage (such as classic SD 1.5 on older hardware), but will not receive any updates.

cu128 will stop updating after cu129 is released.

  • rocm

    • For AMD GPUs with ROCm (an example run command is sketched after this list).

  • xpu

    • For Intel GPUs with XPU.

  • nightly

    • Using a preview version of PyTorch (CUDA).

  • cu121

    • Easy for ComfyUI beginners. Starts with ComfyUI, ComfyUI-Manager and the Photon (SD1.5) model.

    • Using a low-privilege user within the container (easy for WSL2 deploy).

    • Not recommended for Podman or rootless deploy (use any image below instead).

    • Using CUDA 12.1 + Python 3.11.

  • cu124-slim

    • Similar to cu121, with many dependencies pre-installed; starts with only ComfyUI and ComfyUI-Manager.

    • Smaller download. No SD model included.

    • Using 'root' user within the container (easy for rootless deploy).

    • Using CUDA 12.4 + Python 3.12.

  • cu121-megapak

    • All-in-one bundle, including dev kits.

    • Using CUDA 12.1 + Python 3.11.

  • cu124-megapak

    • All-in-one bundle, including dev kits.

    • Using CUDA 12.4 + Python 3.12.

  • cu124-cn

    • For users in mainland China. Using mirror sites for all download links.

  • comfy3d-pt25
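
As an example of a non-CUDA tag, the rocm image is typically started by passing the AMD GPU devices through to the container. A minimal sketch using the standard ROCm container device flags (exact flags may vary by setup):

mkdir -p storage

docker run -it --rm \
  --name comfyui-rocm \
  --device=/dev/kfd --device=/dev/dri \
  --security-opt seccomp=unconfined \
  -p 8188:8188 \
  -v "$(pwd)"/storage:/root \
  -e CLI_ARGS="" \
  yanwk/comfyui-boot:rocm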

License

This open source license is written and valid in both Chinese and English. How good is that!
