ComfyUI-Nvidia-Docker by mmartial

ComfyUI Docker for NVIDIA GPUs

Created 1 year ago
286 stars

Top 91.7% on SourcePulse

Project Summary

This Docker image provides a pre-configured, isolated environment for running ComfyUI on NVIDIA GPUs. It targets users needing robust host OS separation, simplified setup, and reliable GPU acceleration. Benefits include a permission-aware ComfyUI deployment optimized for NVIDIA hardware, with ComfyUI-Manager for easy updates.

How It Works

Leveraging official NVIDIA CUDA base images, it runs ComfyUI as a non-root user with configurable UID/GID mapping for host file permission compatibility. Distinct run (code, venv) and basedir (models, user data) volumes ensure modularity. ComfyUI-Manager is pre-installed, and the environment supports multiple Ubuntu/CUDA versions for driver compatibility. It also integrates the uv package manager for faster Python dependency installation.

Quick Start & Requirements

  • Install/Run: Use docker run or podman run with NVIDIA Container Toolkit, mounting run and basedir directories. Example commands are provided.
  • Prerequisites: NVIDIA drivers compatible with chosen CUDA version, NVIDIA Container Toolkit.
  • Resource Footprint: ~15GB for container/venv, excluding user data.
  • Docs: NVIDIA Docker/Podman setup: https://www.gkr.one/blg-20240523-u24-nvidia-docker-podman.
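The quick-start steps above can be sketched as a single `docker run` invocation. This is a hedged illustration, not the project's verbatim command: the image tag, container mount points, and port are assumptions — check the repository README for the exact published values.

```shell
# Hypothetical invocation of the ComfyUI-Nvidia-Docker image.
# Assumptions: image tag, /comfy/mnt and /basedir mount points, port 8188.
mkdir -p run basedir

docker run --rm -it --gpus all \
  -v "$(pwd)/run:/comfy/mnt" \
  -v "$(pwd)/basedir:/basedir" \
  -e WANTED_UID="$(id -u)" \
  -e WANTED_GID="$(id -g)" \
  -e USE_UV=true \
  -p 8188:8188 \
  mmartial/comfyui-nvidia-docker:ubuntu24_cuda12.6.3
```

Passing `WANTED_UID`/`WANTED_GID` from `id -u`/`id -g` is what keeps files written by the container (models, outputs) owned by the invoking host user rather than root. `podman run` accepts a near-identical command line on systems using Podman instead of Docker.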

Highlighted Details

  • Version Flexibility: Numerous Ubuntu + CUDA tags (e.g., ubuntu24_cuda12.6.3, ubuntu24_cuda12.8) ensure broad NVIDIA driver compatibility.
  • User Permissions: Runs as non-root comfy user, mapping host UID/GID via WANTED_UID/WANTED_GID for correct file ownership.
  • Extensibility: Supports custom scripts (user_script.bash, /userscripts_dir) for advanced installations.
  • ComfyUI-Manager: Pre-installed for easy updates and management, with configurable security levels.
  • Directory Separation: Distinct run (code, venv) and basedir (models, user files) volumes enhance organization.
  • uv Support: Enables uv for faster Python package management (USE_UV=true).
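The `user_script.bash` hook mentioned above could look like the following sketch. The execution point (during container startup, with the venv active) matches the description here, but the package and repository names are purely illustrative assumptions:

```shell
#!/usr/bin/env bash
# Hypothetical user_script.bash dropped into the run directory.
# Assumption: the container executes it at startup with the venv on PATH,
# so installs land inside the container's virtual environment.
set -euo pipefail

# Illustrative extra Python dependency (not prescribed by the project).
pip install --upgrade opencv-python-headless

# Illustrative custom-node checkout; path and repo URL are placeholders.
if [ -d /basedir/custom_nodes ]; then
  cd /basedir/custom_nodes
  [ -d SomeCustomNode ] || git clone https://example.com/SomeCustomNode.git
fi
```

Because the script lives on a mounted volume, it survives container recreation — a reasonable place for site-specific setup that should not be baked into the image.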

Maintenance & Community

The primary maintainer is mmartial. No specific community channels or detailed contributor information are listed.

Licensing & Compatibility

  • License: Governed by the NVIDIA Deep Learning Container License.
  • Compatibility: NVIDIA GPUs, Docker, Podman, WSL2 on Windows, various Ubuntu/CUDA versions.

Limitations & Caveats

  • Switching OS+CUDA versions may cause custom node import failures.
  • Using BASE_DIRECTORY with an outdated ComfyUI can lead to errors.
  • Intended for self-hosted models, not API nodes.
  • podman-compose may not function on WSL2.

Health Check

  • Last Commit: 1 day ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 9
  • Issues (30d): 6
  • Star History: 25 stars in the last 30 days
