Mirrowel/LLM-API-Key-Proxy: LLM API proxy unifying diverse providers with resilience
Summary
Mirrowel/LLM-API-Key-Proxy provides a self-hosted, universal LLM gateway with a single OpenAI-compatible API endpoint. It targets applications needing to integrate multiple LLM providers (OpenAI, Gemini, Anthropic) without code changes. The proxy enhances resilience via automatic API key management, rotation, and failover, simplifying LLM orchestration and improving uptime.
How It Works
The system comprises a FastAPI API Proxy and a Python Resilience Library. The proxy exposes a unified /v1/chat/completions endpoint, routing requests to configured LLM backends using a provider/model_name format. The library manages API key lifecycles, including intelligent selection, rotation, error-based failover, rate limit handling, and escalating cooldowns, ensuring robust LLM access.
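To make the routing format concrete, here is a minimal sketch of a client call against the unified endpoint. It uses the official openai Python client; the listen address (localhost:8000) and the specific model string are assumptions for illustration, while the provider/model_name convention comes from the proxy's routing scheme described above.

from openai import OpenAI

# Point an ordinary OpenAI client at the proxy's unified endpoint.
# The port (8000) is an assumption; use whatever address the proxy binds to.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="YOUR_PROXY_API_KEY",  # the PROXY_API_KEY configured for the proxy
)

# The model field carries the provider/model_name routing format, so
# switching providers means changing only this string.
response = client.chat.completions.create(
    model="gemini/gemini-2.0-flash",  # hypothetical example identifier
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)

Because the endpoint is OpenAI-compatible, an existing OpenAI SDK integration can be repointed at the proxy by changing only base_url and api_key.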
Quick Start & Requirements
Install dependencies with pip install -r requirements.txt, then start the proxy with python src/proxy_app/main.py. Set PROXY_API_KEY and provider credentials via a .env file or the interactive TUI.
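As a sketch, a minimal .env might look like the following. PROXY_API_KEY is named in the project's setup; the provider-specific variable names are hypothetical placeholders, and each provider's exact key name should be taken from the project's documentation.

# Secret that clients must present to the proxy (named in the project's setup).
PROXY_API_KEY=choose-a-strong-secret

# Provider credentials; these variable names are hypothetical placeholders.
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...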
Licensing & Compatibility
The API Proxy component (src/proxy_app/) uses the permissive MIT License. The Resilience Library (src/rotator_library/) is under LGPL-3.0. LGPL-3.0 allows linking from closed-source applications, but modifications to the library itself must be released under LGPL-3.0.
Limitations & Caveats
Setup involves managing multiple API keys and potentially complex OAuth flows for certain providers. Providers like "Antigravity" and "Gemini CLI" access cutting-edge or internal APIs that are subject to Google's policy changes. The LGPL-3.0 license on the resilience library warrants careful review by teams integrating it into proprietary software, especially regarding distribution of modified versions of the library.