Efficient-Deep-Learning by MingSun-Tse

DNN efficiency methods collection (neural compression, acceleration)

Created 6 years ago · 948 stars · Top 39.5% on sourcepulse

Project Summary

This repository serves as a comprehensive, curated collection of research papers and resources focused on efficient deep learning, specifically targeting neural network compression and acceleration techniques. It is primarily aimed at researchers and engineers in the field of deep learning who are looking to understand and implement methods for making models smaller, faster, and more resource-efficient.

How It Works

The repository categorizes papers into key areas: pruning (including Lottery Ticket Hypothesis and pruning at initialization), quantization, and knowledge distillation. It also provides links to related topics like Neural Architecture Search (NAS) and interpretability. The collection is structured chronologically and by sub-topic, offering a historical overview and a deep dive into specific methodologies.
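
To make these categories concrete, the following is a minimal, hypothetical sketch (written here for illustration in PyTorch; the repository itself contains no code) of two of the techniques the list catalogues: global magnitude pruning and a Hinton-style knowledge-distillation loss. The layer sizes, sparsity ratio, temperature T, and blending weight alpha are arbitrary placeholders.

    # Illustrative sketch only; not taken from this repository or any listed paper.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def magnitude_prune(model: nn.Module, sparsity: float = 0.5) -> None:
        """Zero out the globally smallest-magnitude weights of Linear/Conv2d layers."""
        weights = [m.weight.data for m in model.modules()
                   if isinstance(m, (nn.Linear, nn.Conv2d))]
        scores = torch.cat([w.abs().flatten() for w in weights])
        threshold = torch.quantile(scores, sparsity)   # global magnitude cutoff
        for w in weights:
            w.mul_((w.abs() > threshold).float())      # apply the binary mask in place

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
        """Blend a softened teacher-matching KL term with the usual cross-entropy."""
        soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                        F.softmax(teacher_logits / T, dim=1),
                        reduction="batchmean") * (T * T)
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Toy usage: remove half of the weights of a small MLP.
    model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
    magnitude_prune(model, sparsity=0.5)

Pruning at initialization and Lottery Ticket methods differ mainly in when such a mask is computed and whether the surviving weights are rewound to their initial values before retraining.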

Quick Start & Requirements

This repository is a collection of papers and does not have a direct installation or execution command. It serves as a reference guide.

Highlighted Details

  • Extensive lists of papers covering pruning, quantization, and knowledge distillation from major conferences (ICLR, CVPR, ICML, NIPS, etc.) and journals; a minimal quantization sketch illustrating one such technique follows this list.
  • Includes links to code implementations, PyTorch/Caffe re-implementations, and related repositories for many papers.
  • Features curated lists for specific areas like "Awesome-Efficient-ViT" and "Pruning at Initialization."
  • Provides links to relevant books, surveys, and tutorials on efficient deep learning.
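
As a further illustration of the quantization side of the collection (again a hypothetical sketch, not code from the repository or any listed paper), a symmetric per-tensor int8 post-training quantizer can be written in a few lines; the 127/-128 range and the 1e-8 floor on the scale are the only choices made here.

    # Illustrative sketch of symmetric per-tensor int8 post-training quantization.
    import torch

    def quantize_int8(w: torch.Tensor):
        """Map a float tensor to int8 with one symmetric scale; return both."""
        scale = w.abs().max().clamp(min=1e-8) / 127.0   # largest magnitude maps to 127
        q = torch.clamp(torch.round(w / scale), -128, 127).to(torch.int8)
        return q, scale

    def dequantize(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
        return q.float() * scale                        # approximate reconstruction

    w = torch.randn(256, 128)
    q, s = quantize_int8(w)
    max_err = (w - dequantize(q, s)).abs().max()        # error bounded by about scale / 2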

Maintenance & Community

The repository is maintained by MingSun-Tse. It welcomes pull requests for adding pertinent papers, indicating an active community contribution model. Links to related "Awesome" lists and specific workshops suggest a connection to broader research communities.

Licensing & Compatibility

The repository itself does not specify a license, but it is a collection of links to research papers, each with its own licensing and usage terms. Compatibility for commercial use would depend on the individual papers and their associated code.

Limitations & Caveats

As a curated list of papers, this repository does not provide executable code or implementations itself. Users must refer to the individual papers for implementation details and potential dependencies. The sheer volume of papers may require significant effort to navigate and synthesize.

Health Check

  • Last commit: 4 months ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 7 stars in the last 90 days
