adapter-hub: Unified library for parameter-efficient transfer learning in NLP
Top 17.1% on SourcePulse
This library provides a unified interface for parameter-efficient and modular transfer learning, integrating over 10 adapter methods with more than 20 state-of-the-art Transformer models. It targets NLP researchers and practitioners seeking to efficiently fine-tune models and explore advanced techniques like adapter composition and quantization.
How It Works
Adapters acts as an add-on to HuggingFace's Transformers library, injecting lightweight adapter modules into existing Transformer architectures. This approach allows for fine-tuning models with significantly fewer trainable parameters compared to full fine-tuning. It supports advanced research through features like adapter merging via task arithmetic and composing multiple adapters using composition blocks.
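To make this concrete, here is a minimal sketch of the injection workflow, assuming the adapters package alongside a standard HuggingFace checkpoint; the adapter name and config string below are illustrative choices, not the only options.

    import adapters
    from transformers import AutoModelForSequenceClassification

    # Load a regular Transformers model, then retrofit it with adapter support.
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
    adapters.init(model)  # injects adapter modules into the existing architecture

    # Add a bottleneck adapter and train only its (few) parameters.
    model.add_adapter("my_task", config="seq_bn")  # "seq_bn" = sequential bottleneck
    model.train_adapter("my_task")  # freezes the base model's weights

    # Composition blocks follow the same pattern, e.g. stacking two adapters
    # (assumes a second adapter such as "lang_adapter" has been loaded):
    # from adapters.composition import Stack
    # model.set_active_adapters(Stack("lang_adapter", "my_task"))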
Quick Start & Requirements
pip install -U adapters
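After installing, a minimal usage sketch might look like the following; the checkpoint and Hub adapter identifier are illustrative examples rather than guaranteed entries.

    from adapters import AutoAdapterModel

    # Load a base model with adapter support built in.
    model = AutoAdapterModel.from_pretrained("roberta-base")

    # Download a pretrained task adapter from the Hub (identifier is
    # illustrative) and activate it for inference.
    adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-sst2")
    model.set_active_adapters(adapter_name)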
Maintenance & Community
The library is actively maintained by the AdapterHub community. Further information and community channels can be found on their website.
Licensing & Compatibility
The library is compatible with HuggingFace Transformers. Licensing details are not explicitly stated in the README, though libraries of this kind typically carry permissive, commercially usable licenses.
Limitations & Caveats
Specific licensing terms should be verified for commercial applications. The library has also replaced the deprecated adapter-transformers package, so users of the older package must migrate their existing adapters and code.
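A rough sketch of that migration, assuming older code used the transformers.adapters interface bundled into adapter-transformers:

    # Old (adapter-transformers): adapter support was bundled into transformers.
    # from transformers.adapters import AdapterConfig

    # New (adapters): a standalone package attached to a plain Transformers model.
    import adapters
    from transformers import AutoModel

    model = AutoModel.from_pretrained("bert-base-uncased")
    adapters.init(model)  # explicit step; adapter-transformers did this implicitly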