DL4AGX by NVIDIA

Deep learning toolkit for edge AI and autonomous vehicles

Created 6 years ago
261 stars

Top 97.5% on SourcePulse

Project Summary

This repository offers deep learning model designs, deployment strategies, and inference samples specifically engineered for Autonomous Vehicle (AV) applications running on NVIDIA AGX hardware. It addresses the challenge of efficiently deploying complex, state-of-the-art neural networks in real-time AV environments, providing a pathway to leverage NVIDIA's specialized hardware acceleration.

How It Works

The project's methodology revolves around optimizing and deploying deep learning models using NVIDIA's TensorRT inference optimizer and runtime. It offers practical guidance on exporting models from training frameworks (e.g., via ONNX), applying crucial performance enhancements like explicit INT8 quantization and sparsity, and integrating a diverse range of advanced network architectures directly into the TensorRT ecosystem for maximum throughput and minimal latency on AGX platforms.
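As a minimal sketch of that workflow, the snippet below exports a stand-in PyTorch module to ONNX and builds a TensorRT engine with the INT8 and sparsity builder flags enabled. The model, file names, and opset are placeholders, and the repository's own samples may use different tooling, flags, or calibration steps.

```python
import torch
import tensorrt as trt

# 1. Export a trained model to ONNX (stand-in module and input shape).
model = torch.nn.Linear(64, 10).eval()   # placeholder for a real AV network
dummy = torch.randn(1, 64)
torch.onnx.export(model, dummy, "model.onnx", opset_version=17,
                  input_names=["input"], output_names=["logits"])

# 2. Parse the ONNX graph and build a serialized TensorRT engine.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)            # reduced-precision inference
config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)  # exploit 2:4 structured sparsity where present
engine_bytes = builder.build_serialized_network(network, config)

with open("model.engine", "wb") as f:
    f.write(engine_bytes)
```

Note that explicit INT8 quantization normally requires Q/DQ nodes in the exported ONNX graph (or an INT8 calibrator); setting the builder flag alone does not quantize an unprepared network.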

Quick Start & Requirements

Installation commands, specific hardware/software prerequisites (beyond NVIDIA AGX), and setup procedures are not described in the provided README snippet.

Highlighted Details

  • Extensive TensorRT deployment solutions are featured for numerous AV-centric models, including perception architectures like BEVFormer, Far3D, PETR v1 & v2, StreamPETR, and UniAD, as well as Large Language Models (LLMs) via TensorRT-LLM; a minimal runtime inference sketch follows this list.
  • Covers essential optimization techniques such as ONNX model export, explicit INT8 quantization for reduced precision inference, and sparsity exploitation.
  • Includes novel and hardware-friendly model designs like DEST, ReduceFormer, and Swin-Free, tailored for edge deployment.
  • Supports the integration of advanced neural network components like Deformable Convolutions (DCNv4) and Multi-Task Multi-Instance learning (MTMI) within the TensorRT inference pipeline.
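To make the deployment side concrete, here is a hedged runtime sketch that deserializes a previously built engine and runs one inference pass. It assumes the name-based I/O API available in TensorRT 8.5 and later, static input shapes, pycuda for device memory, and the placeholder file model.engine; the repository's samples may structure inference differently.

```python
import numpy as np
import pycuda.autoinit           # creates a CUDA context on import
import pycuda.driver as cuda
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
with open("model.engine", "rb") as f:            # engine from the build sketch above
    engine = trt.Runtime(logger).deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

# Allocate host/device buffers for every I/O tensor (static shapes assumed).
stream = cuda.Stream()
host, device, inputs, outputs = {}, {}, [], []
for i in range(engine.num_io_tensors):
    name = engine.get_tensor_name(i)
    shape = tuple(engine.get_tensor_shape(name))
    dtype = trt.nptype(engine.get_tensor_dtype(name))
    host[name] = np.zeros(shape, dtype=dtype)
    device[name] = cuda.mem_alloc(host[name].nbytes)
    context.set_tensor_address(name, int(device[name]))
    (inputs if engine.get_tensor_mode(name) == trt.TensorIOMode.INPUT
     else outputs).append(name)

# Feed dummy data, run one inference pass, and fetch the results.
for name in inputs:
    host[name][...] = np.random.rand(*host[name].shape)
    cuda.memcpy_htod_async(device[name], host[name], stream)
context.execute_async_v3(stream.handle)
for name in outputs:
    cuda.memcpy_dtoh_async(host[name], device[name], stream)
stream.synchronize()

for name in outputs:
    print(name, host[name].shape, host[name].dtype)
```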

Maintenance & Community

No information regarding maintainers, community channels (e.g., Discord/Slack), roadmap, or project health signals is present in the provided text.

Licensing & Compatibility

The license type and any compatibility notes for commercial or closed-source use are not specified in the README snippet.

Limitations & Caveats

The repository targets NVIDIA AGX platforms and the TensorRT inference engine exclusively, which implies vendor lock-in and limited cross-platform portability. The provided information does not include performance benchmarks, detailed setup requirements, or known limitations/bugs.

Health Check

  • Last Commit: 4 months ago
  • Responsiveness: 1+ week
  • Pull Requests (30d): 0
  • Issues (30d): 2
  • Star History: 6 stars in the last 30 days

