mmartial — ComfyUI Docker for NVIDIA GPUs
Top 91.7% on SourcePulse
Summary
This Docker image provides a pre-configured, isolated environment for running ComfyUI on NVIDIA GPUs. It targets users needing robust host OS separation, simplified setup, and reliable GPU acceleration. Benefits include a permission-aware ComfyUI deployment optimized for NVIDIA hardware, with ComfyUI-Manager for easy updates.
How It Works
Leveraging official NVIDIA CUDA base images, it runs ComfyUI as a non-root user with configurable UID/GID mapping for host file permission compatibility. Distinct run (code, venv) and basedir (models, user data) volumes ensure modularity. ComfyUI-Manager is pre-installed, and the environment supports multiple Ubuntu/CUDA versions for driver compatibility. It also integrates the uv package manager for faster Python dependency installation.
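The UID/GID mapping and the split run/basedir volumes described above can be illustrated with a representative launch command. This is a hedged sketch: the container-side mount points (/comfy/mnt, /basedir) and the image tag are assumptions for illustration, so check the project's example commands for the exact values your image expects.

```shell
# Sketch of a typical launch. Assumptions: the image tag naming and the
# container-side mount paths shown here; verify both against the project docs.
docker run --rm -it \
  --gpus all \
  -e WANTED_UID="$(id -u)" \
  -e WANTED_GID="$(id -g)" \
  -v "$(pwd)/run:/comfy/mnt" \
  -v "$(pwd)/basedir:/basedir" \
  -p 8188:8188 \
  mmartial/comfyui-nvidia-docker:ubuntu24_cuda12.6.3-latest
```

Passing the host's UID/GID via WANTED_UID/WANTED_GID lets the in-container comfy user own files it writes to the mounted directories, so models and outputs in run/ and basedir/ remain editable on the host without chown fixes. Port 8188 is ComfyUI's default web UI port.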
Quick Start & Requirements
Launch with docker run or podman run using the NVIDIA Container Toolkit, mounting the run and basedir directories. Example commands are provided in the project documentation; a setup walkthrough for NVIDIA Docker/Podman is available at https://www.gkr.one/blg-20240523-u24-nvidia-docker-podman.
Highlighted Details
- Multiple OS/CUDA base images (e.g., ubuntu24_cuda12.6.3, ubuntu24_cuda12.8) ensure broad NVIDIA driver compatibility.
- Runs as a non-root comfy user, mapping host UID/GID via WANTED_UID/WANTED_GID for correct file ownership.
- Supports custom user scripts (user_script.bash, /userscripts_dir) for advanced installations.
- Distinct run (code, venv) and basedir (models, user files) volumes enhance organization.
- uv support: enables uv for faster Python package management (USE_UV=true).
Maintenance & Community
The primary maintainer is mmartial. No specific community channels or detailed contributor information are listed.
Licensing & Compatibility
Limitations & Caveats
Switching OS+CUDA versions may cause custom node import failures. Using BASE_DIRECTORY with outdated ComfyUI can lead to errors. Intended for self-hosted models, not API nodes. podman-compose may not function on WSL2.