Awesome-Knowledge-Distillation by FLHonker

Curated list of knowledge distillation papers (2014-2021)

created 5 years ago
2,614 stars

Top 18.4% on sourcepulse

Project Summary

This repository is a curated collection of research papers on Knowledge Distillation (KD), spanning from 2014 to 2021. It serves as a comprehensive resource for researchers and practitioners in machine learning, particularly those interested in model compression, transfer learning, and improving model efficiency. The collection categorizes KD techniques by their approach, application, and specific methodologies.

How It Works

The repository organizes papers into distinct categories, such as "Knowledge from logits," "Knowledge from intermediate layers," "Graph-based," "Data-free KD," and applications in NLP, RecSys, and model pruning. This structured approach allows users to navigate the vast landscape of KD research and identify relevant papers based on their specific interests or problems.
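To make the "Knowledge from logits" category concrete, a minimal sketch of the classic temperature-softened distillation loss (in the style of Hinton et al., 2015) is shown below. This is an illustrative example, not code from the repository; the function names and the temperature value are assumptions for demonstration.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature: higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that matches the teacher's logits incurs zero loss;
# a mismatched student incurs a positive loss.
teacher = [4.0, 1.0, 0.2]
print(distillation_loss(teacher, teacher))          # 0.0
print(distillation_loss(teacher, [2.0, 2.0, 2.0]))  # > 0
```

In practice this soft-label loss is combined with the usual cross-entropy on ground-truth labels; the papers in the other categories replace or augment the logit term with intermediate-layer, structural, or graph-based signals.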

Highlighted Details

  • Extensive coverage of KD techniques from 2014-2021.
  • Categorization includes diverse areas like GANs, Meta-learning, AutoML, RL, and Self-supervised learning.
  • Papers are organized by knowledge source (logits, intermediate layers, structural) and application domains.
  • Includes links to code for many papers, facilitating practical implementation.

Maintenance & Community

The repository is maintained by Yuang Liu (frankliu624@outlook.com) and acknowledges contributions from various individuals.

Licensing & Compatibility

The repository itself is a collection of links to research papers, primarily hosted on arXiv. The licensing and compatibility of individual papers depend on their respective publication venues and authors' choices.

Limitations & Caveats

This is a curated list of papers; the repository itself does not provide a code implementation or framework for knowledge distillation. Users will need to refer to the individual papers, and the code repositories linked alongside many of them, for implementation details.

Health Check
Last commit

2 years ago

Responsiveness

Inactive

Pull Requests (30d)
0
Issues (30d)
0
Star History
43 stars in the last 90 days

Explore Similar Projects

Starred by Stas Bekman (author of Machine Learning Engineering Open Book; research engineer at Snowflake) and Andrey Vasnetsov (cofounder of Qdrant).

awesome-knowledge-distillation by dkozlov

0.1% · 4k stars
Collection of knowledge distillation resources
created 8 years ago
updated 1 month ago