knowledge-distillation-papers by lhyfst

Paper list for knowledge distillation research

Created 6 years ago · 758 stars · Top 46.8% on sourcepulse

Project Summary

This repository serves as a curated collection of academic papers on knowledge distillation, a technique for transferring knowledge from a larger, more complex model (teacher) to a smaller, more efficient model (student). It is a valuable resource for researchers and practitioners in machine learning, particularly those focused on model compression, efficiency, and performance optimization.
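For context, the core recipe behind many of the listed papers is the soft-target loss popularized by Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth labels. A minimal PyTorch sketch follows; the temperature `T` and mixing weight `alpha` are illustrative defaults, not values prescribed by the repository.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-target distillation loss (Hinton et al., 2015) -- a minimal sketch."""
    # Temperature-softened distributions: higher T exposes the teacher's
    # "dark knowledge" about relative class similarities.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # T^2 keeps the soft-term gradient magnitude comparable across temperatures.
    kd_term = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the hard labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1 - alpha) * ce_term
```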

How It Works

The repository lists papers chronologically and by topic, providing a historical overview and a comprehensive survey of the field. It highlights key contributions, seminal works, and recent advancements in knowledge distillation techniques, covering various applications and methodologies.

Quick Start & Requirements

This repository is a collection of academic papers and does not require installation or specific software. The primary requirement is access to academic literature databases or search engines to retrieve the full papers.

Highlighted Details

  • Comprehensive list of papers spanning from 1997 to 2023.
  • Covers foundational works and recent advancements in knowledge distillation.
  • Includes papers on various applications like model compression, transfer learning, and adversarial robustness.
  • Mentions specific techniques such as FitNets, attention transfer, and data-free distillation (see the sketch after this list).
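
As an example of one listed technique, attention transfer (Zagoruyko & Komodakis, 2017) matches spatial attention maps of intermediate feature layers rather than final logits. Below is a hedged sketch of one common variant; the squared-L2 formulation is an illustrative choice, and student and teacher feature maps are assumed to share spatial dimensions.

```python
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    # feat: (B, C, H, W). Collapse channels via squared activations,
    # then L2-normalize each flattened spatial map.
    a = feat.pow(2).mean(dim=1).flatten(start_dim=1)  # (B, H*W)
    return F.normalize(a, p=2, dim=1)

def attention_transfer_loss(student_feat: torch.Tensor,
                            teacher_feat: torch.Tensor) -> torch.Tensor:
    # Assumes matching H and W; otherwise interpolate one map to the
    # other's size before comparing.
    diff = attention_map(student_feat) - attention_map(teacher_feat)
    return diff.pow(2).sum(dim=1).mean()
```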

Maintenance & Community

The repository is maintained by the user "lhyfst". Information on community engagement or active development is not explicitly provided in the README.

Licensing & Compatibility

No license is specified for the repository itself. The licensing of the individual papers is governed by their respective publishers and copyright holders.

Limitations & Caveats

This repository is a curated list of papers and does not provide code implementations or direct access to the papers themselves. Users will need to find and access the papers through other academic channels.

Health Check

  • Last commit: 2 years ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0

Star History

  • 5 stars in the last 90 days

Explore Similar Projects

Starred by Stas Bekman (author of the Machine Learning Engineering Open Book; Research Engineer at Snowflake) and Andrey Vasnetsov (cofounder of Qdrant).

awesome-knowledge-distillation by dkozlov

  • Top 0.1% · 4k stars
  • Collection of knowledge distillation resources
  • Created 8 years ago; updated 1 month ago