adapters by adapter-hub

Unified library for parameter-efficient transfer learning in NLP

created 5 years ago
2,743 stars

Top 17.7% on sourcepulse

Project Summary

This library provides a unified interface for parameter-efficient and modular transfer learning, integrating over 10 adapter methods with more than 20 state-of-the-art Transformer models. It targets NLP researchers and practitioners seeking to efficiently fine-tune models and explore advanced techniques like adapter composition and quantization.

How It Works

Adapters acts as an add-on to HuggingFace's Transformers library, injecting lightweight adapter modules into existing Transformer architectures. This approach allows for fine-tuning models with significantly fewer trainable parameters compared to full fine-tuning. It supports advanced research through features like adapter merging via task arithmetic and composing multiple adapters using composition blocks.
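
To make this concrete, here is a minimal sketch using the documented adapters API (the model checkpoint and adapter name are arbitrary choices for illustration):

    from adapters import AutoAdapterModel

    # Load a pretrained Transformer with adapter support built in.
    model = AutoAdapterModel.from_pretrained("roberta-base")

    # Inject a sequential bottleneck adapter into each Transformer layer
    # ("seq_bn" is one of the library's built-in configurations).
    model.add_adapter("my_task", config="seq_bn")

    # Freeze the base model so only the adapter weights remain trainable.
    model.train_adapter("my_task")

    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"trainable: {trainable:,} of {total:,} parameters")

The printed ratio illustrates the efficiency argument: only the small injected modules are updated, while the pretrained weights stay frozen.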

Quick Start & Requirements

  • Installation: pip install -U adapters (a quick smoke test is sketched after this list)
  • Prerequisites: Python 3.9+, PyTorch 2.0+.
  • Resources: Built on HuggingFace Transformers, which is pulled in as a dependency.
  • Documentation: https://docs.adapterhub.ml/
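
After installing, a quick smoke test might look like the sketch below. The adapter ID is a placeholder, not a real Hub entry; browse https://adapterhub.ml/ for published adapters that match your base model.

    import adapters
    from adapters import AutoAdapterModel

    print(adapters.__version__)  # confirm the package imports cleanly

    model = AutoAdapterModel.from_pretrained("bert-base-uncased")

    # Placeholder adapter ID -- substitute a real one from the Hub.
    adapter_name = model.load_adapter("<org>/<adapter-id>")
    model.set_active_adapters(adapter_name)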

Highlighted Details

  • Supports over 10 adapter methods, including LoRA, QLoRA, Prefix Tuning, and AdapterFusion (a combined sketch follows this list).
  • Integrates with 20+ Transformer models from HuggingFace.
  • Enables adapter merging and composition for complex transfer learning scenarios.
  • Offers support for quantized training methods such as QLoRA.
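
As a sketch of how these features combine (adapter names and hyperparameters below are arbitrary, and the config and composition classes follow the documented API):

    from adapters import AutoAdapterModel, LoRAConfig
    import adapters.composition as ac

    model = AutoAdapterModel.from_pretrained("xlm-roberta-base")

    # A LoRA adapter; rank and scaling are illustrative hyperparameters.
    model.add_adapter("task_lora", config=LoRAConfig(r=8, alpha=16))

    # Composition: stack a task adapter on top of a language adapter,
    # as in MAD-X-style cross-lingual transfer.
    model.add_adapter("lang", config="seq_bn")
    model.add_adapter("task", config="seq_bn")
    model.set_active_adapters(ac.Stack("lang", "task"))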

Maintenance & Community

The library is actively maintained by the AdapterHub community. Further information and community channels can be found at https://adapterhub.ml/.

Licensing & Compatibility

The library is compatible with HuggingFace Transformers. Licensing details are not explicitly stated in the README; check the repository's license file before commercial use.

Limitations & Caveats

This library supersedes the now-deprecated adapter-transformers package, so existing code and adapters must be migrated to the new package.
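
For code written against the old package, the core change is an explicit initialization step, roughly as sketched below (based on the published transition notes; model and adapter names are arbitrary):

    # adapter-transformers shipped as a drop-in fork of transformers, so
    # models exposed adapter methods out of the box. With adapters, support
    # is attached explicitly to a plain HuggingFace model instead:
    from transformers import AutoModel
    import adapters

    model = AutoModel.from_pretrained("bert-base-uncased")
    adapters.init(model)  # injects adapter methods into the loaded model

    model.add_adapter("my_task", config="seq_bn")
    model.train_adapter("my_task")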

Health Check

  • Last commit: 2 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 1
  • Issues (30d): 2
  • Star History: 47 stars in the last 90 days

Explore Similar Projects

SwissArmyTransformer by THUDM

Transformer library for flexible model development
Top 0.3% · 1k stars · created 3 years ago · updated 7 months ago
Starred by Jeremy Howard (Cofounder of fast.ai) and Stas Bekman (Author of Machine Learning Engineering Open Book; Research Engineer at Snowflake).

x-transformers by lucidrains

Transformer library with extensive experimental features
Top 0.2% · 5k stars · created 4 years ago · updated 3 days ago
Starred by Chip Huyen (Author of AI Engineering, Designing Machine Learning Systems), Jeff Hammerbacher (Cofounder of Cloudera), and 6 more.

transformers by huggingface

ML library for pretrained model inference and training
Top 0.2% · 148k stars · created 6 years ago · updated 14 hours ago
Starred by Lilian Weng (Cofounder of Thinking Machines Lab), Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla, OpenAI; Author of CS 231n), and 42 more.