Resource list for parameter-efficient transfer learning
This repository is a curated collection of resources on Parameter-Efficient Transfer Learning (PEFT) for pre-trained vision models. It is aimed at computer vision researchers and practitioners who want to adapt large pre-trained models with minimal parameter updates, and it offers a systematic review and categorization of existing PEFT methods.
How It Works
The project categorizes PEFT methods into three main groups: Addition-based Tuning, Partial-based Tuning, and Unified-based Tuning. This classification provides a structured overview of techniques that modify or add a small number of parameters to pre-trained models, enabling efficient adaptation to downstream tasks without the computational cost of full fine-tuning.
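The sketch below is not from the repository; it is a minimal illustration, assuming a torchvision ViT backbone and a hypothetical bottleneck Adapter module, of how partial-based tuning (unfreezing a small subset of existing weights) differs from addition-based tuning (training a small added module while the backbone stays frozen). Unified-based approaches combine such techniques under a single framework.

```python
# Illustrative sketch only -- not code from the linked repository.
import torch
import torch.nn as nn
from torchvision.models import vit_b_16

# Hypothetical pre-trained vision backbone.
model = vit_b_16(weights="IMAGENET1K_V1")

# Partial-based tuning: freeze everything, then unfreeze a small subset of
# existing parameters (here, only the classification head and bias terms).
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("heads") or name.endswith("bias")

# Addition-based tuning: keep the backbone frozen and train a small added
# module instead. This bottleneck adapter is a common example; in practice
# such modules are inserted inside the transformer blocks.
class Adapter(nn.Module):
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual bottleneck: only down/up projections are trained.
        return x + self.up(self.act(self.down(x)))

# The point of PEFT: only a small fraction of parameters is trainable.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable params: {trainable} / {total} ({100 * trainable / total:.2f}%)")
```

The specific module names ("heads", bias parameters) and the Adapter design are assumptions for illustration; the repository's linked papers and codebases each define their own variants.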
Quick Start & Requirements
This repository is a collection of papers and resources, not a runnable library. It requires no installation.
Maintenance & Community
The README states that the repository is maintained ("Maintained? - yes"), and the latest commit is recent, indicating active curation.
Licensing & Compatibility
The repository itself does not specify a license. It links to numerous research papers and code repositories, each with its own license, so users must consult the licenses of the individual linked projects before use.
Limitations & Caveats
This is a curated list of resources and does not provide a unified framework or benchmark for direct use. Users need to individually find, install, and integrate the PEFT methods from the linked code repositories.