future-agi by future-agi

Platform for shipping self-improving AI agents

Created 4 days ago

582 stars

Top 55.4% on SourcePulse

Project Summary

FutureAGI is an open-source, self-hostable platform designed to streamline the development, evaluation, and production deployment of AI agents. It addresses the common challenge of fragmented tooling by offering an integrated solution for the entire AI agent lifecycle, from initial simulation to continuous self-improvement, helping engineers and researchers build more robust, reliable agents and ship self-improving AI applications faster.

How It Works

FutureAGI collapses the AI agent lifecycle—simulate, evaluate, protect, monitor, and optimize—into a single, cohesive platform with a unified feedback loop. This approach eliminates the need to stitch together disparate tools for observability, evaluation, and guardrails. By treating every trace as a signal for improvement, it facilitates the creation of agents that not only perform well but also continuously learn and adapt. The platform is built on an open, inspectable architecture, allowing users to understand and customize every component.

Quick Start & Requirements

  • Cloud: Sign up at app.futureagi.com (free tier available).
  • Self-Host (Docker): Clone the repository, copy .env.example to .env, and run docker compose up -d. Access at http://localhost:3031.
  • Self-Host (Kubernetes): Use helm repo add futureagi and helm install fagi futureagi/future-agi.
  • Prerequisites: Python 3.11+, Go 1.23+, React 18, Node 20+. Docker or Kubernetes is required for self-hosting, depending on the deployment method chosen.
  • Documentation: see the project's docs site.

Highlighted Details

  • All-in-One Lifecycle: Integrates simulation (text/voice), evaluation (50+ metrics including LLM-as-judge), protection (scanners/adapters), monitoring (OpenTelemetry-native), an OpenAI-compatible gateway, and prompt optimization (6 algorithms).
  • Production-Grade Gateway: Achieves high performance with ~9.9 ns weighted routing, ~29 k req/s on t3.xlarge, and P99 latency ≤ 21 ms with guardrails enabled.
  • Open & Extensible: Apache 2.0 licensed core, with all components inspectable. Supports integration via OpenTelemetry and OpenAI-compatible HTTP, allowing users to drop in their own stacks.
  • Extensive Integrations: Offers 50+ framework instrumentors and supports over 100 LLM providers, various voice platforms, and vector databases.
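Because the gateway exposes an OpenAI-compatible HTTP surface, existing client code can be pointed at a self-hosted deployment with only a URL change. The sketch below builds (but does not send) such a request using only the standard library; the endpoint path, port, model name, and key are assumptions based on the OpenAI API convention and the default port above, not confirmed FutureAGI values.

```python
import json
from urllib import request

# Hedged sketch: targeting a self-hosted FutureAGI gateway through its
# OpenAI-compatible HTTP interface. URL, model, and key are placeholders.
GATEWAY_URL = "http://localhost:3031/v1/chat/completions"  # assumed path

payload = {
    "model": "gpt-4o-mini",  # the gateway routes this to an upstream provider
    "messages": [{"role": "user", "content": "Summarize the last trace."}],
}

req = request.Request(
    GATEWAY_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer <your-key>",  # placeholder credential
        "Content-Type": "application/json",
    },
    method="POST",
)
# response = request.urlopen(req)  # uncomment against a running gateway
```

Routing through the gateway rather than calling a provider directly is what lets guardrails, tracing, and weighted routing apply to every request without application changes.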

Maintenance & Community

The project is actively maintained with a strong community presence on Discord and GitHub Discussions. A public roadmap is available for community input. Contributions are welcomed, with a clear contributing guide and CLA process.

Licensing & Compatibility

Future AGI is licensed under the Apache License 2.0. This permissive license allows for commercial use, integration into closed-source applications, and modification without copyleft restrictions, ensuring no vendor lock-in. Users retain ownership of their evaluation logic and data.

Limitations & Caveats

The project is currently in a "Nightly release for early testing" phase, so rough edges are to be expected; a stable release is forthcoming. The Kubernetes Helm chart is still in v1 development.

Health Check

  • Last Commit: 3 hours ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 54
  • Issues (30d): 11
  • Star History: 598 stars in the last 4 days
