tvm_mlir_learn by BBuf

Compiler learning resources collection

Created 5 years ago
2,679 stars

Top 17.3% on SourcePulse

Project Summary

This repository serves as a curated collection of learning resources for AI compilers, focusing on TVM and MLIR. It targets engineers and researchers interested in understanding and optimizing deep learning models through compiler techniques, offering practical examples, paper readings, and video tutorials.

How It Works

The project organizes learning materials into distinct categories: scheduler (TVM scheduling examples), dataflow_controlflow (distinguishing data and control flow), paper_reading (compiler papers like PET, Ansor, MLIR), relay (TVM Relay IR examples), codegen (TVM code generation), and torchscript (PyTorch TorchScript usage). It also includes practical scripts for compiling TVM in Docker, running ResNet18 inference with TVM via ONNX and PyTorch, and exporting ONNX models.

Quick Start & Requirements

  • Install: Primarily uses Python packages. Specific examples may require TVM installation (e.g., pip install apache-tvm).
  • Prerequisites: Python, TVM (specific versions may be needed for examples), PyTorch, ONNX. Some examples might benefit from CUDA-enabled GPUs for performance testing.
  • Resources: Setup involves installing Python packages and potentially compiling TVM. Resource requirements vary by example, with some demonstrating performance on specific hardware like Jetson Nano or x86 CPUs.
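A minimal environment for the ONNX/PyTorch examples can be set up roughly as follows; package names are the standard PyPI ones, and individual scripts may pin specific versions.

```shell
# Core TVM package (prebuilt wheel with the LLVM CPU backend).
pip install apache-tvm

# For the PyTorch / TorchScript examples.
pip install torch torchvision

# For ONNX export and for cross-checking outputs against ONNX Runtime.
pip install onnx onnxruntime

# Quick sanity check that TVM imports cleanly.
python -c "import tvm; print(tvm.__version__)"
```

Examples that target GPUs or embedded boards (e.g. Jetson Nano) instead require TVM built from source with the matching CUDA or cross-compilation settings.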

Highlighted Details

  • Comprehensive collection of video tutorials and translated content on TVM, MLIR, and LLVM from various sources.
  • Practical examples demonstrating TVM's capabilities, including custom passes, cross-compilation, and optimizing GEMM operations.
  • Includes guides on exporting PyTorch models to ONNX and running them with TVM.
  • Features paper reading notes on key AI compiler research topics like Ansor and PET.

Maintenance & Community

  • The repository is maintained by BBuf.
  • Links to related learning repositories by the author are provided.
  • Community engagement is encouraged via GitHub stars.

Licensing & Compatibility

  • The repository itself does not explicitly state a license in the README. Individual components or linked resources may have their own licenses.

Limitations & Caveats

  • The project is a collection of learning resources rather than a single runnable tool, so each script must be set up and run according to its own requirements.
  • Some examples might be tied to specific versions of TVM or other dependencies, potentially requiring adjustments for compatibility.
Health Check

  • Last Commit: 11 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 18 stars in the last 30 days

Explore Similar Projects

Starred by Stas Bekman (author of "Machine Learning Engineering Open Book"; Research Engineer at Snowflake) and Thomas Wolf (cofounder of Hugging Face).

transformer by sannykim

  • 564 stars
  • Resource list for studying Transformers
  • Created 6 years ago; updated 2 years ago
  • Starred by Théophile Gervet (cofounder of Genesis AI), Jason Knight (Director of AI Compilers at NVIDIA; cofounder of OctoML), and 7 more.

lingua by facebookresearch

  • 5k stars
  • LLM research codebase for training and inference
  • Created 1 year ago; updated 7 months ago