LLM-API-Key-Proxy by Mirrowel

LLM API proxy unifying diverse providers with resilience

Created 7 months ago
301 stars

Top 88.8% on SourcePulse

Project Summary

Summary

Mirrowel/LLM-API-Key-Proxy provides a self-hosted, universal LLM gateway with a single OpenAI-compatible API endpoint. It targets applications needing to integrate multiple LLM providers (OpenAI, Gemini, Anthropic) without code changes. The proxy enhances resilience via automatic API key management, rotation, and failover, simplifying LLM orchestration and improving uptime.

How It Works

The system comprises a FastAPI API Proxy and a Python Resilience Library. The proxy exposes a unified /v1/chat/completions endpoint, routing requests to configured LLM backends using a provider/model_name format. The library manages API key lifecycles, including intelligent selection, rotation, error-based failover, rate limit handling, and escalating cooldowns, ensuring robust LLM access.
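As a concrete illustration, any client that already speaks the OpenAI API can be pointed at the proxy. The sketch below assumes the proxy listens locally on port 8000 with an OpenAI-compatible base path of /v1, and uses an illustrative model string; the port, path, and model name are assumptions, while the provider/model_name convention and PROXY_API_KEY come from the project description.

```python
# Minimal sketch: calling the proxy through the standard OpenAI Python client.
# Assumed: the proxy is reachable at http://localhost:8000/v1 (port and path
# may differ in a real deployment).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # point the SDK at the proxy
    api_key="YOUR_PROXY_API_KEY",         # the PROXY_API_KEY configured for the proxy
)

# The target model is addressed as provider/model_name, per the proxy's routing scheme.
response = client.chat.completions.create(
    model="gemini/gemini-2.0-flash",      # illustrative provider/model_name pair
    messages=[{"role": "user", "content": "Hello from the proxy!"}],
)
print(response.choices[0].message.content)
```

Because the endpoint is OpenAI-compatible, the same pattern works for any tool that allows a custom OpenAI base URL.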

Quick Start & Requirements

  • Installation: Pre-compiled binaries are available for Windows, macOS, and Linux. Source install: clone the repo, run pip install -r requirements.txt, then start the proxy with python src/proxy_app/main.py.
  • Prerequisites: Python 3.x.
  • Configuration: Requires PROXY_API_KEY and provider credentials, supplied via a .env file or the interactive TUI (see the .env sketch after this list).
  • Documentation: Links to GitHub Releases and Deployment Guide.
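A minimal .env sketch for the configuration step above. Only PROXY_API_KEY is named in this summary, so the provider variable names below are illustrative placeholders; the project's Deployment Guide and TUI are the authoritative references.

```
# Key that clients must present to the proxy (named in the Quick Start above)
PROXY_API_KEY=replace-with-a-strong-secret

# Provider credentials. Variable names here are illustrative; check the
# project's Deployment Guide or interactive TUI for the exact names expected.
GEMINI_API_KEY=...
OPENAI_API_KEY=...
ANTHROPIC_API_KEY=...
```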

Highlighted Details

  • Universal Compatibility: Integrates with any app supporting custom OpenAI base URLs.
  • Multi-Provider Support: Connects to Gemini, OpenAI, Anthropic, and others via one endpoint.
  • Built-in Resilience: Features automatic key rotation, failover, rate limit handling, and cooldowns (see the conceptual sketch after this list).
  • Exclusive Providers: Supports custom integrations like Antigravity (Gemini 3 + Claude Opus 4.5) and Gemini CLI.
  • Interactive TUI: Simplifies configuration and credential management.
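To make the resilience bullet concrete, the following is a conceptual Python sketch of key rotation with escalating cooldowns. It is not the rotator_library's actual API; every class and method name is invented for illustration.

```python
import time

# Conceptual sketch of escalating-cooldown key rotation. This is NOT the
# library's implementation, only an illustration of the general idea.
class KeyPool:
    def __init__(self, keys, base_cooldown=5.0):
        self.keys = keys
        self.base_cooldown = base_cooldown
        self.cooldown_until = {k: 0.0 for k in keys}  # unix timestamp per key
        self.failures = {k: 0 for k in keys}          # consecutive failures per key

    def pick(self):
        """Return the first key that is not cooling down, or None."""
        now = time.time()
        for key in self.keys:
            if self.cooldown_until[key] <= now:
                return key
        return None

    def report_failure(self, key):
        """On a rate limit or error, put the key on an escalating cooldown."""
        self.failures[key] += 1
        backoff = self.base_cooldown * (2 ** (self.failures[key] - 1))
        self.cooldown_until[key] = time.time() + backoff

    def report_success(self, key):
        """A successful call resets the key's failure count and cooldown."""
        self.failures[key] = 0
        self.cooldown_until[key] = 0.0
```

A request loop would pick() a key, attempt the call, and on a rate limit or error call report_failure() and retry with the next available key, mirroring the rotation, failover, and cooldown behavior described above.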

Licensing & Compatibility

The API Proxy component (src/proxy_app/) uses the permissive MIT License. The Resilience Library (src/rotator_library/) is under LGPL-3.0. LGPL-3.0 allows linking from closed-source applications, but modifications to the library itself must be released under LGPL-3.0.

Limitations & Caveats

Setup involves managing multiple API keys and potentially complex OAuth flows for certain providers. Providers like "Antigravity" and "Gemini CLI" access cutting-edge or internal APIs subject to Google's policy changes. The LGPL-3.0 license for the resilience library requires careful review for users integrating into proprietary software, especially regarding distribution of modified library versions.

Health Check

  • Last Commit: 2 days ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 31
  • Issues (30d): 19

Star History

154 stars in the last 30 days

Explore Similar Projects

Starred by Chip Huyen (author of "AI Engineering" and "Designing Machine Learning Systems") and David Cramer (cofounder of Sentry).

llmgateway by theopenco

  • Top 1.8% on SourcePulse, 824 stars
  • LLM API gateway for unified provider access
  • Created 9 months ago; updated 2 days ago