Collection of research papers on deep learning and computer architecture
This repository is a curated collection of research papers and academic contributions focused on neural network accelerators and deep learning hardware architecture. It serves as a resource for researchers, engineers, and students interested in cutting-edge advances in AI chip design, computer architecture, and hardware-software co-design for machine learning.
How It Works
The project compiles significant academic papers, primarily from top-tier conferences and journals, that introduce novel architectures, algorithms, and methodologies for accelerating neural networks. It categorizes these contributions by year and conference, providing a structured overview of the field's evolution. The descriptions often highlight key innovations such as new dataflows, processing-in-memory (PIM) techniques, quantization strategies, and specialized hardware designs.
Quick Start & Requirements
As a paper collection, the repository has no installation steps or commands to run. Accessing the content requires obtaining the individual papers, which may be available through academic databases or publisher websites.
Maintenance & Community
The repository is maintained by Fengbin Tu, an Assistant Professor at The Hong Kong University of Science and Technology, with a research focus on AI chips and systems. The project is presented as a personal selection of research, welcoming contributions and collaboration.
Licensing & Compatibility
The repository itself does not specify a license. The content consists of links and references to academic papers, each with its own copyright and licensing terms as determined by the respective publishers and authors.
Limitations & Caveats
This repository is a curated list of papers and does not provide executable code, simulators, or direct access to the full text of the papers; users must find and access the papers independently through academic channels. Given the breadth of coverage, individual papers receive only brief summaries.