Self-hostable AI agent platform for local, private AI automation
LocalAGI provides a self-hostable AI agent platform for maximum privacy and flexibility, acting as a drop-in replacement for OpenAI's API with advanced agentic capabilities. It targets users who want to run AI locally on consumer-grade hardware, offering a no-code approach to building customizable AI assistants and automations.
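Because the API is OpenAI-compatible, existing clients can simply be pointed at a locally running instance. The sketch below is a minimal example, assuming a chat-completions endpoint at http://localhost:8080/v1/chat/completions and a placeholder model name; adjust both to match your deployment.

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        // Assumed endpoint; the port and path depend on how the stack is deployed.
        const url = "http://localhost:8080/v1/chat/completions"

        // Standard OpenAI-style chat request body.
        payload, err := json.Marshal(map[string]any{
            "model": "local-model", // placeholder: use a model configured in your local backend
            "messages": []map[string]string{
                {"role": "user", "content": "List three things you can automate for me."},
            },
        })
        if err != nil {
            panic(err)
        }

        resp, err := http.Post(url, "application/json", bytes.NewReader(payload))
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        body, _ := io.ReadAll(resp.Body)
        fmt.Println(string(body)) // OpenAI-style JSON response
    }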
How It Works
LocalAGI pairs a Go backend with a React frontend, integrating with LocalAI for model inference and LocalRecall for memory management. Agents can be configured via a web UI or REST API, supporting complex behaviors such as planning, reasoning, and teaming. The architecture emphasizes local execution, eliminating the need for cloud services or API keys, and includes built-in connectors for various communication platforms.
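As a rough sketch of what programmatic agent configuration could look like over HTTP: the endpoint path and configuration fields below are illustrative assumptions, not the documented LocalAGI API, and the web UI exposes the same capabilities without code.

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "net/http"
    )

    func main() {
        // Hypothetical agent definition; field names are assumptions, not the real schema.
        cfg, err := json.Marshal(map[string]any{
            "name":        "research-assistant",
            "description": "Plans, searches, and drafts summaries",
            "model":       "local-model", // a model served by the local inference backend
        })
        if err != nil {
            panic(err)
        }

        // Hypothetical REST endpoint; check the running instance for the actual route.
        resp, err := http.Post("http://localhost:8080/api/agents", "application/json", bytes.NewReader(cfg))
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println("status:", resp.Status)
    }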
Quick Start & Requirements
Requires Docker with Docker Compose. Start the stack with one of:
docker compose up (CPU)
docker compose -f docker-compose.nvidia.yaml up (NVIDIA GPU)
docker compose -f docker-compose.intel.yaml up (Intel GPU)
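After bringing the stack up, a small readiness check can confirm the service is reachable before wiring in other tooling. The address below is an assumption; use the port published in the compose file you started.

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        // Assumed address of the LocalAGI web UI; match it to the port
        // published in your docker-compose file.
        const base = "http://localhost:8080"

        for i := 0; i < 30; i++ {
            resp, err := http.Get(base)
            if err == nil {
                resp.Body.Close()
                fmt.Println("LocalAGI is up:", resp.Status)
                return
            }
            time.Sleep(2 * time.Second)
        }
        fmt.Println("timed out waiting for", base)
    }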
Highlighted Details
Drop-in OpenAI-compatible API, no-code agent configuration through the web UI or REST API, fully local execution with no cloud services or API keys, and built-in connectors for common communication platforms.
Maintenance & Community
The project is actively maintained by mudler. Community links are not explicitly provided in the README.
Licensing & Compatibility
Limitations & Caveats
The project is primarily distributed via Docker, which may add a layer of complexity for users unfamiliar with containerization. While it supports CPU, performance will be significantly limited compared to GPU acceleration. Specific model compatibility and performance tuning may require user effort.