welltop-cn/ComfyUI-TeaCache: ComfyUI extension for inference speedup
Top 36.3% on SourcePulse
This repository provides ComfyUI nodes for TeaCache, a training-free caching method that accelerates diffusion-model inference by exploiting timestep-specific differences between model outputs. It targets ComfyUI users working with image, video, and audio diffusion models, offering significant speedups with minimal quality degradation.
How It Works
TeaCache estimates how much the model's output changes between diffusion timesteps and reuses the cached previous output whenever the estimated change is small, avoiding redundant recomputation and speeding up inference. Integration into ComfyUI is seamless, requiring only node connections within existing workflows.
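The sketch below illustrates this idea in a simplified form. It is not the extension's actual code: the toy denoiser, the sampler update, the use of the raw model input as the change estimator, and the default values are placeholder assumptions; only the rel_l1_thresh and max_skip_steps parameter names come from the node's documented settings.

```python
import torch

def toy_denoiser(x, t):
    # Stand-in for an expensive diffusion model forward pass.
    return x * torch.cos(torch.tensor(float(t))) + 0.01 * torch.randn_like(x)

def run_with_teacache(x, timesteps, rel_l1_thresh=0.15, max_skip_steps=3):
    cached_out, prev_in, skips = None, None, 0
    outputs = []
    for t in timesteps:
        model_in = x  # assumption: real TeaCache estimates change from a timestep-modulated input
        use_cache = (
            cached_out is not None
            and skips < max_skip_steps
            and ((model_in - prev_in).abs().mean()
                 / (prev_in.abs().mean() + 1e-8)).item() < rel_l1_thresh
        )
        if use_cache:
            out = cached_out                 # reuse cached output, skip the forward pass
            skips += 1
        else:
            out = toy_denoiser(model_in, t)  # full forward pass
            cached_out, prev_in, skips = out, model_in, 0
        outputs.append(out)
        x = x - 0.1 * out                    # toy sampler update
    return outputs

outs = run_with_teacache(torch.randn(1, 4, 8, 8), list(range(50, 0, -1)))
```

The caching decision trades accuracy for speed: a higher rel_l1_thresh or max_skip_steps skips more forward passes but lets more approximation error accumulate.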
Quick Start & Requirements
- Install by cloning the repository into ComfyUI's custom_nodes directory and running pip install -r requirements.txt.
- Connect the TeaCache node after the model loading nodes in an existing workflow (see the sketch after this list).
- Recommended rel_l1_thresh and max_skip_steps values are provided for various models.
- Example workflows are available in the examples folder.
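For orientation, here is a hypothetical sketch of what a node exposing these parameters can look like, written against ComfyUI's standard custom-node conventions (INPUT_TYPES, RETURN_TYPES, NODE_CLASS_MAPPINGS). The class name, defaults, and the way the settings are attached to the model are illustrative assumptions, not this extension's actual implementation.

```python
class TeaCacheSketchNode:
    CATEGORY = "model_patches"
    RETURN_TYPES = ("MODEL",)
    FUNCTION = "apply"

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "model": ("MODEL",),  # output of a checkpoint/model loader node
                # Placeholder defaults; use the values recommended for your model.
                "rel_l1_thresh": ("FLOAT", {"default": 0.15, "min": 0.0, "max": 1.0, "step": 0.01}),
                "max_skip_steps": ("INT", {"default": 3, "min": 0, "max": 10}),
            }
        }

    def apply(self, model, rel_l1_thresh, max_skip_steps):
        patched = model.clone()  # patch a copy of the ModelPatcher, keep the original intact
        # A real implementation would wrap the model's forward pass so that steps are
        # skipped when the estimated output change stays below rel_l1_thresh; here the
        # settings are simply attached for illustration.
        patched.model_options.setdefault("transformer_options", {})["teacache"] = {
            "rel_l1_thresh": rel_l1_thresh,
            "max_skip_steps": max_skip_steps,
        }
        return (patched,)

NODE_CLASS_MAPPINGS = {"TeaCacheSketch": TeaCacheSketchNode}
```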
Highlighted Details
- Includes a Compile Model node that leverages torch.compile for further inference acceleration; a sketch of the underlying mechanism follows below.
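As a point of reference, the snippet below shows the generic torch.compile pattern such a node builds on; the tiny model is a placeholder, not the extension's code. The first call pays the compilation cost, and later calls reuse the compiled kernels, which also explains the limitation noted below.

```python
import torch

class TinyDenoiser(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Conv2d(4, 4, 3, padding=1)

    def forward(self, x):
        return self.net(x)

model = TinyDenoiser().eval()
compiled = torch.compile(model)                   # compilation is deferred until the first call
with torch.no_grad():
    out = compiled(torch.randn(1, 4, 16, 16))     # first call: slow (compiles); later calls: fast
```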
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
- Initial compilation with the Compile Model node can be time-consuming.