Finetuning framework for mass concept erasure in diffusion models
MACE (Mass Concept Erasure) is a framework designed to prevent large text-to-image diffusion models from generating harmful or misleading content by erasing specific concepts. It targets researchers and developers working with diffusion models who need to control or filter unwanted outputs, offering a scalable solution for removing up to 100 concepts simultaneously while maintaining model specificity.
How It Works
MACE employs a two-pronged approach: closed-form cross-attention refinement and LoRA (Low-Rank Adaptation) fine-tuning. The cross-attention refinement updates the projection matrices in the U-Net's cross-attention blocks with a closed-form solution that discourages residual information about the target phrase from being encoded in co-existing words. Concurrently, an individual LoRA module is trained for each concept to be erased, removing the concept's intrinsic information. A key innovation is fusing the multiple LoRA modules without mutual interference while avoiding catastrophic forgetting.
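One way to picture the closed-form step is as a ridge-regression update of each projection matrix: embeddings of co-existing words are remapped to what they would have produced had the concept been absent, while a regularizer keeps the matrix close to its original values. The sketch below illustrates this kind of update only; the function name, argument shapes, and exact objective are illustrative assumptions, not the repository's implementation.

import torch

def refine_projection(W_old: torch.Tensor,
                      E_src: torch.Tensor,
                      E_tgt: torch.Tensor,
                      lam: float = 0.1) -> torch.Tensor:
    # Hypothetical closed-form, ridge-style refinement of a cross-attention
    # key/value projection (names and shapes are illustrative, not MACE's API).
    #   W_old : (d_out, d_in) frozen projection from the U-Net
    #   E_src : (d_in, n) embeddings of words co-existing with the target phrase
    #   E_tgt : (d_in, n) embeddings of the same words with the concept removed
    #   lam   : regularization pulling the solution back toward W_old
    # Solves: min_W sum_i ||W e_src_i - W_old e_tgt_i||^2 + lam * ||W - W_old||_F^2
    d_in = E_src.shape[0]
    V = W_old @ E_tgt                              # desired outputs, (d_out, n)
    lhs = V @ E_src.T + lam * W_old                # (d_out, d_in)
    rhs = E_src @ E_src.T + lam * torch.eye(d_in)  # (d_in, d_in)
    return lhs @ torch.linalg.inv(rhs)             # refined projection matrix

In a formulation like this, lam controls the trade-off between erasing the concept and preserving the model's behavior on unrelated prompts.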
Quick Start & Requirements
Create a conda environment (conda create -n mace python=3.10), install PyTorch with CUDA 11.7, and pin diffusers==0.22.0, transformers==4.46.2, and huggingface_hub==0.25.2.

Train an erasure model from a config file:
CUDA_VISIBLE_DEVICES=0 python training.py configs/object/erase_ship.yaml

Generate images with the fused model:
CUDA_VISIBLE_DEVICES=0 python inference.py --model_path /path/to/saved_model/LoRA_fusion_model ...
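The --model_path above points to a fused model, i.e. the per-concept LoRA modules merged back into the base weights. As a rough picture of what such a fusion produces, the snippet below simply sums standard LoRA deltas (B @ A) into a weight matrix; MACE's actual fusion solves an objective that also suppresses interference between concepts, so treat this as a hypothetical baseline, not the repository's method.

import torch

def fuse_loras(W_base: torch.Tensor,
               loras: list[tuple[torch.Tensor, torch.Tensor]],
               scale: float = 1.0) -> torch.Tensor:
    # Hypothetical helper: merge several per-concept LoRA deltas into one weight.
    #   W_base : (d_out, d_in) original projection weight
    #   loras  : list of (B, A) pairs, B: (d_out, r), A: (r, d_in)
    # A plain sum ignores cross-concept interference, which is exactly what
    # MACE's fusion step is designed to address; this is only the naive baseline.
    delta = sum(scale * (B @ A) for B, A in loras)
    return W_base + delta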
Highlighted Details
Maintenance & Community
The project is the official implementation of a CVPR 2024 paper. No community channels (Discord/Slack) or active maintenance signals are mentioned in the README.
Licensing & Compatibility
The repository does not explicitly state a license. The underlying libraries used (e.g., Diffusers, GroundingDINO) have their own licenses, which may impact commercial use or closed-source linking.
Limitations & Caveats
The official Grounded-SAM installation is complex and resource-intensive (24GB+ GPU). The HuggingFace version requires even more RAM (>28GB). The project relies on specific versions of dependencies, which may require careful environment management.