consistency_models by Kinyugo

Consistency models for fast synthesis and zero-shot editing

Created 2 years ago
251 stars

Top 99.8% on SourcePulse

Project Summary

Consistency Models are a novel family of generative models offering high sample quality and fast, one-step generation without adversarial training. This mini-library provides tools for researchers and practitioners to train and infer with these models, enabling versatile applications like zero-shot data editing (inpainting, interpolation) and offering a trade-off between compute and sample quality via few-step sampling.

How It Works

The library implements training for Consistency Models, which learn to map noisy data back to clean data across a range of noise levels. It supports two training paradigms: standard Consistency Training, which uses an Exponential Moving Average (EMA) of the student model as the target network, and Improved Consistency Training, which omits the EMA and uses a Pseudo-Huber loss instead. Inference uses the trained model for rapid generation, with optional few-step sampling to trade compute for sample quality. Zero-shot editing tasks are handled by conditioning the sampling process on known data (e.g., the unmasked region for inpainting).
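The EMA-free training paradigm described above can be sketched as follows. This is a minimal illustration of the technique, not the library's API: the function names, the `model(x, sigma)` signature, and the constant `c` are assumptions for the example.

```python
import torch

def pseudo_huber_loss(pred, target, c=0.00054):
    # Pseudo-Huber loss used by improved consistency training:
    # sqrt(||pred - target||^2 + c^2) - c, averaged over the batch.
    # (The improved-training recipe scales c with data dimensionality;
    # the value here is illustrative.)
    diff = (pred - target).flatten(1)
    return (torch.sqrt(diff.pow(2).sum(dim=1) + c * c) - c).mean()

def consistency_training_step(model, x0, sigma_n, sigma_np1):
    # One improved-consistency-training step (no EMA teacher): the same
    # network, evaluated at adjacent noise levels with shared noise,
    # must produce consistent outputs.
    noise = torch.randn_like(x0)
    x_np1 = x0 + sigma_np1.view(-1, 1, 1, 1) * noise  # noisier sample
    x_n = x0 + sigma_n.view(-1, 1, 1, 1) * noise      # less-noisy sample, same noise
    pred = model(x_np1, sigma_np1)
    with torch.no_grad():
        target = model(x_n, sigma_n)  # stop-gradient target (same weights, no EMA)
    return pseudo_huber_loss(pred, target)
```

The stop-gradient on the less-noisy branch is what replaces the EMA teacher in the improved recipe.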

Quick Start & Requirements

  • Installation: pip install -q -e git+https://github.com/Kinyugo/consistency_models.git#egg=consistency_models
  • Prerequisites: Requires PyTorch. GPU acceleration is highly recommended for practical training and inference.
  • Links: No specific quick-start guides or demos are linked beyond the GitHub repository.

Highlighted Details

  • Implements both standard and improved consistency training techniques.
  • Supports fast one-step generation and few-step sampling for quality/compute trade-offs.
  • Enables zero-shot data editing (inpainting, interpolation) without task-specific fine-tuning.
  • Designed to be modality-agnostic, applicable to various consistency model architectures.
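The one-step vs. few-step trade-off above can be sketched as multistep consistency sampling: denoise from pure noise, then repeatedly re-noise to a smaller level and denoise again. This is a generic sketch, not the library's API; the `model(x, sigma)` signature and function name are assumptions, and the re-noising here is simplified (the original recipe subtracts the minimum noise level when re-noising).

```python
import torch

@torch.no_grad()
def multistep_consistency_sampling(model, shape, sigmas, device="cpu"):
    # Few-step sampling: start from pure noise at the largest sigma, then
    # alternate denoise -> partially re-noise at decreasing noise levels.
    # More entries in `sigmas` spend extra compute for higher quality;
    # a single entry recovers one-step generation.
    sigma_max = sigmas[0]
    x = torch.randn(shape, device=device) * sigma_max
    sample = model(x, torch.full((shape[0],), sigma_max, device=device))
    for sigma in sigmas[1:]:
        noise = torch.randn(shape, device=device)
        x = sample + sigma * noise  # re-noise the current sample to level sigma
        sample = model(x, torch.full((shape[0],), sigma, device=device))
    return sample
```

Passing `sigmas=[sigma_max]` gives one-step generation; appending intermediate levels refines the sample at the cost of one model call each.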

Maintenance & Community

Community contributions are welcomed via pull requests and issues. No specific community channels (e.g., Discord, Slack) or notable maintainer information are provided.

Licensing & Compatibility

The repository's license is not explicitly stated in the README; verify the licensing terms before any commercial or closed-source integration.

Limitations & Caveats

The README notes that the final_timesteps parameter of the timestep discretization schedule significantly affects model performance, particularly on smaller datasets or shorter training runs, and may require careful tuning to achieve good results. Support for Consistency Distillation and Latent Consistency Models is planned as future work.
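For context, consistency models commonly space their noise levels with a Karras-style discretization, where the number of levels (the role played by final_timesteps above) controls how finely the schedule is resolved. A minimal sketch, with illustrative parameter values that are assumptions, not the repository's defaults:

```python
def karras_sigmas(n, sigma_min=0.002, sigma_max=80.0, rho=7.0):
    # Karras-style noise-level discretization: n levels spaced between
    # sigma_max and sigma_min, warped by rho so that levels cluster
    # near small sigmas. Larger n gives a finer schedule.
    ramp = [i / (n - 1) for i in range(n)]
    min_inv = sigma_min ** (1 / rho)
    max_inv = sigma_max ** (1 / rho)
    return [(max_inv + r * (min_inv - max_inv)) ** rho for r in ramp]
```

With few levels the gaps between adjacent sigmas are large, which is one plausible reason the choice of this parameter matters more on small datasets or short runs.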

Health Check
Last Commit

2 years ago

Responsiveness

Inactive

Pull Requests (30d)
0
Issues (30d)
0
Star History
0 stars in the last 30 days

Explore Similar Projects

Starred by Benjamin Bolte (Cofounder of K-Scale Labs), Patrick von Platen (Author of Hugging Face Diffusers; Research Engineer at Mistral), and 10 more.

consistency_models by openai
0.0% · 6k stars
PyTorch code for consistency models research paper
Created 3 years ago · Updated 1 year ago