Parameter-efficient instruction tuning methods: an empirical study
This repository systematically compares parameter-efficient fine-tuning (PEFT) methods on instruction-tuning tasks using the SuperNI dataset. It targets researchers and engineers seeking efficient LLM adaptation, offering empirical insights to guide method selection and reduce computational costs.
How It Works
The project evaluates a range of parameter-efficient fine-tuning (PEFT) techniques, adapting implementations from established libraries such as adapter-transformers and peft. It uses the SuperNI dataset as its instruction-tuning benchmark, focusing on empirical comparisons to identify effective fine-tuning strategies for large language models that reduce computational cost and memory footprint during adaptation.
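As a concrete illustration of the parameter savings PEFT methods target, here is a minimal stdlib-only sketch (not taken from the repository) of the parameter count for a LoRA-style adapter: a frozen d_in x d_out base weight is augmented with a trainable rank-r update, so only r*(d_in + d_out) parameters are trained instead of d_in*d_out. All dimensions below are illustrative.

```python
# Hypothetical parameter-count sketch for a LoRA-style PEFT method:
# the frozen base weight W (d_in x d_out) stays fixed, and only a
# low-rank update B @ A of rank r is trained.
d_in, d_out, r = 768, 768, 8   # illustrative GPT2-like hidden size

full_params = d_in * d_out        # parameters trained in full fine-tuning
lora_params = r * (d_in + d_out)  # parameters trained with a rank-r adapter

print(full_params)                # 589824
print(lora_params)                # 12288
print(f"{lora_params / full_params:.2%} of the full parameter count")
```

At rank 8 the adapter trains roughly 2% of the layer's parameters, which is the kind of cost reduction the empirical comparison in this repository is designed to quantify across methods.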
Quick Start & Requirements
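The setup steps listed in this section can be sketched as a shell session. This is a hedged sketch, not an official install script: it assumes you have access to the private peft fork, and the repository URL below is a placeholder.

```shell
# Clone the private peft fork on its adapter branch (URL is a placeholder;
# the fork is private, so you need access to it).
git clone -b release-v0.4.0-adapter <peft-private-repo-url> peft-private
cd peft-private
pip install -e .          # note: the peft library requires torch>=1.13.0
pip install rouge-score
```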
- Install the private peft fork (release-v0.4.0-adapter branch): cd peft-private, pip install -e ., then pip install rouge-score. Note: the peft library requires torch>=1.13.0.
- GPT2 model required at cache/saved_pretrained/gpt2.
- Hardware: hfai HPC (A100x8 GPUs).
- Default dataset path: ../../data.
Highlighted Details
- Designed for hfai HPC's pre-emptable environments.
- Includes hp_run.sh scripts.
Maintenance & Community
No specific community channels, roadmap, or contributor details are provided in the README. The project is associated with an arXiv publication.
Licensing & Compatibility
The repository's license is not specified in the README, posing a potential blocker for commercial use or integration.
Limitations & Caveats
The codebase is heavily optimized for the hfai HPC platform, potentially requiring significant adaptation for other environments due to specific configurations and dependencies. The use of a private peft repository and specific, older PyTorch version dependencies (1.10.2+cu113, despite peft requiring >=1.13.0) may complicate setup and integration. The absence of a specified license is a critical adoption blocker.
Last updated: 1 year ago. Status: Inactive.