Local AI API platform for robots, optional cloud
Cortex.cpp is a local AI API platform designed for robotics and AI applications, enabling users to run various AI models (vision, speech, language, tabular, action) without cloud dependency. It targets developers and researchers seeking an efficient, self-contained AI inference solution.
How It Works
Cortex.cpp utilizes a multi-engine architecture, initially supporting llama.cpp, with extensibility for custom engines. It features automatic hardware acceleration for NVIDIA, AMD, and Intel GPUs, optimizing inference performance. The platform exposes an OpenAI-compatible API, simplifying integration with existing tools and workflows.
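Because the API is OpenAI-compatible, an ordinary HTTP client can talk to the local server once it is running. A minimal sketch, assuming the default local API port 39281 and a model already pulled under the name llama3.2 (both values may differ on your installation):

# Send a chat completion request to the local Cortex server.
# Port and model name are assumptions; substitute your own values.
curl -s http://127.0.0.1:39281/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3.2",
        "messages": [{"role": "user", "content": "Hello from Cortex"}]
      }'

Any client library built for the OpenAI API should work the same way, pointed at the local base URL instead of the cloud endpoint.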
Quick Start & Requirements
Install on Linux with the install script:
curl -s https://raw.githubusercontent.com/menloresearch/cortex/main/engine/templates/linux/install.sh | sudo bash
Then start the local server with cortex start, pull models with cortex pull <model_name>, and run them with cortex run <model_name>.
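As a concrete walkthrough (the model name llama3.2 is illustrative; use whichever model identifier your installation lists):

cortex start              # launch the local API server
cortex pull llama3.2      # download the model weights locally
cortex run llama3.2       # start an interactive session with the model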
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The project is under active development, so rapid changes and breaking updates are possible. Specific licensing details for commercial use are not provided.