adapters by adapter-hub

Unified library for parameter-efficient transfer learning in NLP

Created 5 years ago
2,795 stars

Top 16.9% on SourcePulse

Project Summary

This library provides a unified interface for parameter-efficient and modular transfer learning, integrating over 10 adapter methods with more than 20 state-of-the-art Transformer models. It targets NLP researchers and practitioners seeking to efficiently fine-tune models and explore advanced techniques like adapter composition and quantization.

How It Works

Adapters acts as an add-on to HuggingFace's Transformers library, injecting lightweight adapter modules into existing Transformer architectures. This approach allows for fine-tuning models with significantly fewer trainable parameters compared to full fine-tuning. It supports advanced research through features like adapter merging via task arithmetic and composing multiple adapters using composition blocks.
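
To make the parameter-efficiency claim concrete, here is a back-of-envelope calculation (a sketch, not the library's code) of how many parameters a bottleneck adapter adds per layer versus full fine-tuning. The figures assume a hypothetical BERT-base-like model (hidden size 768, 12 layers, ~110M parameters) and one adapter per layer with a reduction factor of 16:

```python
# Back-of-envelope comparison: full fine-tuning vs. bottleneck adapters.
# All model figures are illustrative assumptions, not taken from the library.

HIDDEN = 768                # model hidden size (assumed)
LAYERS = 12                 # number of Transformer layers (assumed)
FULL_PARAMS = 110_000_000   # approximate total parameters of the base model

def bottleneck_adapter_params(hidden: int, reduction_factor: int) -> int:
    """Parameters of one bottleneck adapter: a down-projection and an
    up-projection, each with a bias (residuals add no new weights)."""
    bottleneck = hidden // reduction_factor
    down = hidden * bottleneck + bottleneck  # weights + bias
    up = bottleneck * hidden + hidden        # weights + bias
    return down + up

adapters_total = LAYERS * bottleneck_adapter_params(HIDDEN, reduction_factor=16)
print(f"adapter params: {adapters_total:,}")
print(f"fraction of full model: {adapters_total / FULL_PARAMS:.2%}")
```

Under these assumptions the adapters account for roughly 0.8% of the model's parameters, which is why only the adapter weights need to be stored and shared per task.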

Quick Start & Requirements

  • Installation: pip install -U adapters
  • Prerequisites: Python 3.9+, PyTorch 2.0+, and HuggingFace Transformers.
  • Documentation: https://docs.adapterhub.ml/

Highlighted Details

  • Supports over 10 adapter methods including LoRA, QLoRA, Prefix Tuning, and AdapterFusion.
  • Integrates with 20+ Transformer models from HuggingFace.
  • Enables adapter merging and composition for complex transfer learning scenarios.
  • Supports memory-efficient fine-tuning of quantized models via QLoRA.
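
Of the methods listed above, LoRA is the simplest to illustrate. The toy sketch below (pure Python, not the library's implementation) shows the core idea: the frozen weight matrix W is augmented by a low-rank trainable update scaled by alpha / r, and because B is initialized to zero the adapted layer initially behaves exactly like the base model:

```python
import random

def matvec(m, v):
    """Plain matrix-vector product over nested lists."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in m]

def lora_forward(W, A, B, x, alpha=16, r=4):
    """y = W @ x + (alpha / r) * B @ (A @ x); only A and B would be trained."""
    base = matvec(W, x)                 # frozen path
    low_rank = matvec(B, matvec(A, x))  # trainable low-rank path
    scale = alpha / r
    return [base[i] + scale * low_rank[i] for i in range(len(base))]

d_out, d_in, r = 4, 6, 2
random.seed(0)
W = [[random.uniform(-1, 1) for _ in range(d_in)] for _ in range(d_out)]
A = [[random.uniform(-1, 1) for _ in range(d_in)] for _ in range(r)]
B = [[0.0] * r for _ in range(d_out)]   # zero init: no change at start
x = [1.0] * d_in

# Before any training the LoRA layer reproduces the frozen layer exactly.
assert lora_forward(W, A, B, x, r=r) == matvec(W, x)
```

Because only A and B receive gradients, a rank-r update stores d_in * r + r * d_out trainable values per layer instead of d_in * d_out, which is the same storage argument the adapter methods above exploit.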

Maintenance & Community

The library is actively maintained by the AdapterHub community. Further information and community channels can be found on their website.

Licensing & Compatibility

The library is compatible with HuggingFace Transformers. Licensing details are not stated in this summary; consult the repository's LICENSE file before commercial use.

Limitations & Caveats

The library supersedes the deprecated adapter-transformers package, so existing code and trained adapters must be migrated. Licensing terms should be verified before commercial use.

Health Check
Last Commit

3 months ago

Responsiveness

1 week

Pull Requests (30d)
0
Issues (30d)
0
Star History
8 stars in the last 30 days

Explore Similar Projects

Starred by Jeremy Howard (Cofounder of fast.ai) and Stas Bekman (Author of "Machine Learning Engineering Open Book"; Research Engineer at Snowflake).

SwissArmyTransformer by THUDM

Top 0.1% · 1k stars
Transformer library for flexible model development
Created 4 years ago
Updated 1 year ago
Starred by Tobi Lutke (Cofounder of Shopify), Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla, OpenAI; author of CS 231n), and 5 more.

matmulfreellm by ridgerchu

Top 0.1% · 3k stars
MatMul-free language models
Created 1 year ago
Updated 1 month ago