Paper list for parameter-efficient transfer learning in CV/multimodal
This repository is a curated collection of research papers on Parameter-Efficient Transfer Learning (PETL) for computer vision and multimodal domains. It serves as a valuable resource for researchers and practitioners looking to adapt large pre-trained models efficiently, addressing challenges like overfitting and high computational costs associated with full fine-tuning.
How It Works
The collection categorizes PETL methods into "Prompt Learning" and "Adapter" techniques, with an "Others" section for related approaches. Prompt learning involves adding task-specific prompts to the input or model architecture, while adapters insert small, trainable modules into the pre-trained model. This approach allows for significant adaptation with minimal parameter updates, preserving the knowledge of the large pre-trained model.
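To make the two families concrete, here is a minimal numpy sketch, not taken from any specific paper in this list: a prompt is a block of trainable vectors prepended to the input embeddings, and a bottleneck adapter down-projects hidden states, applies a nonlinearity, up-projects, and adds a residual. All names, shapes, and the zero-initialization convention are illustrative assumptions.

```python
import numpy as np

def prepend_prompts(embeds, prompts):
    """Prompt learning: prepend trainable prompt vectors to the input.

    embeds:  (seq_len, d_model) frozen input embeddings
    prompts: (n_prompts, d_model) trainable prompt parameters
    """
    return np.concatenate([prompts, embeds], axis=0)

def adapter(hidden, W_down, W_up):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual.

    hidden: (seq_len, d_model); W_down: (d_model, r); W_up: (r, d_model)
    Only W_down and W_up are trained; the backbone stays frozen.
    """
    z = np.maximum(hidden @ W_down, 0.0)  # down-projection + ReLU
    return hidden + z @ W_up              # up-projection + residual connection

rng = np.random.default_rng(0)
d_model, r, seq_len, n_prompts = 16, 4, 3, 2

h = rng.standard_normal((seq_len, d_model))
prompts = rng.standard_normal((n_prompts, d_model))
W_down = rng.standard_normal((d_model, r)) * 0.01
W_up = np.zeros((r, d_model))  # zero-init: the adapter starts as an identity map

prompted = prepend_prompts(h, prompts)   # shape (seq_len + n_prompts, d_model)
out = adapter(h, W_down, W_up)           # equals h at initialization
```

With `W_up` initialized to zero, the adapter path contributes nothing at the start of training, so adaptation begins from the unchanged pre-trained model while updating only the small `W_down`/`W_up` matrices (here 2·16·4 = 128 parameters per layer, versus a full d_model×d_model block).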
Quick Start & Requirements
This is a curated list of papers, not a software library. No installation or execution is required. Links to papers and code are provided for each entry.
Maintenance & Community
The repository structure and contribution guidelines are inspired by thunlp/DeltaPapers. Contributions are welcome and should follow the specified format for new paper entries.
Licensing & Compatibility
The repository itself does not specify a license; the linked papers and code each carry their own licensing and usage terms. Suitability for commercial or closed-source use therefore depends on the licenses of the individual papers and their associated code.
Limitations & Caveats
This is a reference list and does not provide runnable code or benchmarks. Users must consult the individual papers for implementation details, performance claims, and specific requirements.