Distillation method for fast, high-quality image generation
This repository provides the official implementation of Trajectory Consistency Distillation (TCD), a distillation technique that accelerates pre-trained diffusion models for few-step image generation. It offers a flexible alternative to existing acceleration methods such as LCM-LoRA, aiming for higher quality and broader applicability across diffusion model backbones and community adaptations.
How It Works
TCD is inspired by Consistency Models and leverages exponential integrators to design an effective consistency function. This approach allows for flexible Number of Function Evaluations (NFEs) without the significant quality degradation seen in other methods at higher NFEs. Unlike adversarial distillation methods, TCD avoids mode collapse and "Janus" artifacts, producing more realistic and diverse outputs.
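The multi-step sampling idea described above can be illustrated with a toy sketch: a consistency function jumps from a noisy sample straight to an estimate of the clean trajectory endpoint, and a parameter gamma controls how much noise is re-injected before the next step (gamma = 0 gives deterministic sampling). Note that `consistency_fn` and the linear noise schedule here are placeholder stand-ins for illustration, not the paper's actual parameterization:

```python
import numpy as np

def consistency_fn(x, t):
    # Toy stand-in for the learned consistency function:
    # maps a noisy sample at time t to an estimate of the clean sample x_0.
    return x / (1.0 + t)

def tcd_sample(x_T, timesteps, gamma=0.3, seed=0):
    # timesteps: decreasing schedule ending at 0; its length sets the NFEs,
    # which can be chosen freely without changing the function.
    rng = np.random.default_rng(seed)
    x = np.asarray(x_T, dtype=float)
    for t, t_prev in zip(timesteps[:-1], timesteps[1:]):
        x0_est = consistency_fn(x, t)        # jump toward the trajectory endpoint
        if t_prev > 0:
            t_mid = (1.0 - gamma) * t_prev   # partial re-noising target
            x = x0_est + t_mid * rng.standard_normal(x.shape)
        else:
            x = x0_est                       # final step: return the clean estimate
    return x
```

Because each iteration re-anchors on a clean-sample estimate, adding more steps refines the result rather than degrading it, which is the intuition behind TCD's tolerance for higher NFEs.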
Quick Start & Requirements
pip install diffusers transformers accelerate peft
python gradio_app.py

Running gradio_app.py launches a local Gradio demo, and a Colab demo is also provided.
Maintenance & Community
The project has seen recent integration into the Diffusers library and has released LoRA checkpoints for SDv1.5 and SDv2.1. A ComfyUI plugin is also available. The project acknowledges contributions from the Diffusers team and community members.
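Given the Diffusers integration noted above, a typical few-step setup would look roughly like the sketch below. This is an illustrative example, not verbatim project code: the `h1t/TCD-SDXL-LoRA` checkpoint name and the parameter values follow the project's public examples and may differ for the SDv1.5/SDv2.1 releases.

```python
import torch
from diffusers import StableDiffusionXLPipeline, TCDScheduler

# Load a base model and swap in the TCD scheduler
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")
pipe.scheduler = TCDScheduler.from_config(pipe.scheduler.config)

# Attach the distilled TCD LoRA weights (checkpoint name is an assumption)
pipe.load_lora_weights("h1t/TCD-SDXL-LoRA")
pipe.fuse_lora()

image = pipe(
    prompt="a photo of a red fox in a snowy forest",
    num_inference_steps=4,  # few-step generation; can be raised without retraining
    guidance_scale=0.0,     # distilled models typically run without CFG
    eta=0.3,                # TCD's gamma: 0 = deterministic, larger = more stochastic
).images[0]
image.save("fox.png")
```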
Licensing & Compatibility
The repository's license is not explicitly stated in the README. The code builds on the Hugging Face Diffusers library (Apache 2.0), but a dependency's license does not determine the repository's own terms; verify the license of the TCD code and model weights before commercial use.
Limitations & Caveats
The README prominently features a statement addressing plagiarism allegations from the CTM team, detailing communication and differences between the methods. While the project claims superior performance, direct comparative benchmarks against all concurrent works are not provided.