Mobile-first inference framework for neural networks
Top 1.9% on sourcepulse
ncnn is a high-performance neural network inference framework optimized for mobile platforms, built for efficient on-device deployment of deep learning models. It targets developers building AI-powered mobile applications and claims significant speed advantages over other open-source frameworks on mobile CPUs.
How It Works
ncnn is a pure C++ implementation with no third-party dependencies, prioritizing a minimal footprint and maximum performance. It achieves this through ARM NEON assembly-level optimizations, careful memory management, and multi-core parallel processing. The framework also supports GPU acceleration via Vulkan, custom layer registration, and model quantization.
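As a sketch of the custom-layer extension point mentioned above, the class below doubles its input in place. "MyScale" is a hypothetical layer name chosen for illustration, and the OpenMP pragma mirrors the multi-core parallelism ncnn uses internally; this is not a layer shipped with the framework.

```cpp
#include "layer.h"
#include "net.h"

// A toy in-place layer that doubles every value, illustrating how
// ncnn custom layers are declared. "MyScale" is a hypothetical name.
class MyScale : public ncnn::Layer
{
public:
    MyScale()
    {
        one_blob_only = true;   // single input blob, single output blob
        support_inplace = true; // results are written back into the input
    }

    virtual int forward_inplace(ncnn::Mat& bottom_top_blob, const ncnn::Option& opt) const
    {
        const int channels = bottom_top_blob.c;
        const int size = bottom_top_blob.w * bottom_top_blob.h;

        // per-channel loop spread across the threads ncnn was configured with
        #pragma omp parallel for num_threads(opt.num_threads)
        for (int q = 0; q < channels; q++)
        {
            float* ptr = bottom_top_blob.channel(q);
            for (int i = 0; i < size; i++)
                ptr[i] *= 2.f;
        }
        return 0;
    }
};

// generates the MyScale_layer_creator factory function
DEFINE_LAYER_CREATOR(MyScale)

// Registration happens on the Net before the param file is loaded:
//   ncnn::Net net;
//   net.register_custom_layer("MyScale", MyScale_layer_creator);
```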
Quick Start & Requirements
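A minimal end-to-end inference pass might look like the following sketch. The model files (model.param, model.bin) and the blob names "data" and "prob" are placeholders; a real application would use whatever names its converted model defines.

```cpp
#include "net.h" // ncnn core header (may be <ncnn/net.h> in an installed build)

int main()
{
    ncnn::Net net;
    // opt into the Vulkan GPU path when ncnn was built with NCNN_VULKAN=ON
    net.opt.use_vulkan_compute = true;

    // "model.param"/"model.bin" are placeholders for a converted ncnn model
    if (net.load_param("model.param") || net.load_model("model.bin"))
        return -1;

    // a dummy 224x224 3-channel input; real apps typically build the input
    // with ncnn::Mat::from_pixels() from camera or image data
    ncnn::Mat in(224, 224, 3);
    in.fill(0.5f);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in); // "data" is a placeholder input blob name

    ncnn::Mat out;
    ex.extract("prob", out); // "prob" is a placeholder output blob name
    return 0;
}
```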
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The platform support matrix distinguishes between platforms where ncnn works and platforms where it is fast: inference runs on many targets, but speed is not guaranteed on all of them, particularly for certain GPU types on macOS and Windows. Some ARM-specific platforms are marked as "shall work, not confirmed."