Curated list of knowledge distillation papers (2014-2021)
This repository is a curated collection of research papers on Knowledge Distillation (KD), spanning from 2014 to 2021. It serves as a comprehensive resource for researchers and practitioners in machine learning, particularly those interested in model compression, transfer learning, and improving model efficiency. The collection categorizes KD techniques by their approach, application, and specific methodologies.
How It Works
The repository organizes papers into distinct categories, such as "Knowledge from logits," "Knowledge from intermediate layers," "Graph-based," "Data-free KD," and applications in NLP, RecSys, and model pruning. This structure lets users navigate the large body of KD research and quickly locate papers relevant to a specific interest or problem; a short illustrative sketch of the core logit-based technique follows below.
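To orient readers, the "Knowledge from logits" category builds on the classic soft-target loss of Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution alongside the usual hard-label objective. Below is a minimal PyTorch sketch of that loss, not code from the repository; the function name kd_loss and the hyperparameter values T and alpha are illustrative assumptions.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target distillation combined with standard cross-entropy.

    A sketch of the Hinton et al. (2015) loss; T and alpha are
    illustrative hyperparameters, not values from any listed paper.
    """
    # Soften both distributions with temperature T; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label cross-entropy on the ground-truth classes.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Most logit-based papers in the list vary this recipe, e.g. by reweighting the two terms, adapting the temperature, or replacing the KL term with another divergence.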
Maintenance & Community
The repository is maintained by Yuang Liu (frankliu624@outlook.com) and acknowledges contributions from various individuals.
Licensing & Compatibility
The repository itself is a collection of links to research papers, most of them hosted on arXiv. Licensing of each paper depends on its publication venue and the authors' choices.
Limitations & Caveats
This is a curated list of papers and does not provide any code implementation or framework for knowledge distillation itself. Users will need to refer to the individual papers for implementation details.