Collection of knowledge distillation resources
Top 13.3% on sourcepulse
This repository is a curated list of papers, implementations, and resources related to Knowledge Distillation (KD) in machine learning. It serves as a comprehensive reference for researchers and practitioners interested in model compression, knowledge transfer, and improving the efficiency of neural networks by training smaller "student" models to mimic larger "teacher" models.
How It Works
The repository organizes a vast collection of academic papers, categorizing them by year and topic. It also links to various open-source implementations of KD techniques across different deep learning frameworks like PyTorch, TensorFlow, MXNet, and Caffe. This allows users to explore the theoretical foundations and practical applications of KD methods, from foundational concepts like "dark knowledge" to advanced techniques such as contrastive distillation and data-free distillation.
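To make the core idea concrete, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al.'s "dark knowledge" formulation) in PyTorch. The function name, temperature, and weighting values are illustrative assumptions, not taken from any specific repository linked in the list.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Combine a soft-target KL term with the usual hard-label cross-entropy.

    T and alpha are hypothetical hyperparameters chosen for illustration.
    """
    # Soften both distributions with temperature T; the teacher's "dark knowledge"
    # lies in the relative probabilities it assigns to the non-target classes.
    soft_targets = F.log_softmax(teacher_logits / T, dim=-1)
    soft_preds = F.log_softmax(student_logits / T, dim=-1)
    # Scaling by T^2 keeps the soft-target gradients comparable across temperatures.
    kd = F.kl_div(soft_preds, soft_targets, log_target=True,
                  reduction="batchmean") * T * T
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Example usage: a batch of 8 samples over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

Most of the more advanced methods catalogued in the list (feature-based, contrastive, or data-free distillation) replace or augment this soft-target term, but the student-mimics-teacher objective above is the common starting point.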
Highlighted Details
Maintenance & Community
This is a community-driven "awesome" list, meaning it is maintained through user contributions. The README does not name specific maintainers or point to dedicated community channels.
Licensing & Compatibility
The repository itself is a collection of links and does not declare a license of its own. The linked papers and code repositories remain subject to their respective licenses.
Limitations & Caveats
As a curated list, the repository's content depends on community contributions and may not be exhaustive or fully up to date. The quality and maintenance status of the linked code implementations can also vary.