Curated list of research papers on parameter-efficient tuning
This repository serves as a curated collection of essential research papers on parameter-efficient tuning (Delta Tuning) methods for pre-trained models. It targets researchers and practitioners in NLP and machine learning who need to adapt large models efficiently, offering a structured overview of key techniques and their applications.
How It Works
The project compiles and categorizes seminal papers in parameter-efficient tuning, a field that addresses the prohibitive computational and storage costs of adapting massive pre-trained models. By focusing on methods that modify only a small fraction of parameters, Delta Tuning significantly reduces adaptation expenses, making large models more accessible. The collection highlights the practical benefits and potential theoretical implications of this approach.
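The savings described above can be illustrated with a back-of-the-envelope parameter count. The sketch below (an assumption for illustration, not code from this repository) compares full fine-tuning of a single weight matrix against a LoRA-style low-rank delta, one representative Delta Tuning method: instead of updating the full d_out × d_in matrix, only two small factors B (d_out × r) and A (r × d_in) are trained.

```python
def full_trainable_params(d_out: int, d_in: int) -> int:
    """Parameters updated when fine-tuning the full weight matrix W."""
    return d_out * d_in

def lora_trainable_params(d_out: int, d_in: int, rank: int) -> int:
    """Parameters updated for a rank-r delta W + B @ A,
    with B of shape (d_out, r) and A of shape (r, d_in)."""
    return rank * (d_out + d_in)

# Illustrative sizes: one 4096 x 4096 projection, rank-8 delta.
d_out, d_in, r = 4096, 4096, 8
full = full_trainable_params(d_out, d_in)    # 16,777,216
delta = lora_trainable_params(d_out, d_in, r)  # 65,536
print(f"trainable fraction: {delta / full:.4%}")  # about 0.39%
```

The exact ratio depends on the method and rank, but the pattern is the same across the papers collected here: the tunable "delta" is orders of magnitude smaller than the frozen model.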
Quick Start & Requirements
This is a curated list of papers; there is no code to install or run.
Maintenance & Community
The project is maintained by the THU NLP group. Contributions are welcome, with guidelines provided for adding new papers.
Licensing & Compatibility
The repository's own license is not stated here; the individual papers referenced remain subject to the terms of their respective publishers.
Limitations & Caveats
This resource is a bibliography and does not provide code for implementing the described methods. The rapid evolution of the field means the list may not be exhaustive or entirely up-to-date.