ultrasev: LLM API reverse proxy for unified access
Top 100.0% on SourcePulse
A Cloudflare Worker-based reverse proxy designed to unify access to various Large Language Model (LLM) APIs, including OpenAI, Gemini, and Groq. It presents a single, OpenAI-compatible API endpoint, allowing developers to integrate multiple LLM providers seamlessly using familiar SDKs and tools, simplifying LLM orchestration and potentially optimizing costs or performance.
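Because the proxy presents an OpenAI-compatible endpoint, a client can build the same request it would send to OpenAI, only pointed at the proxy. The sketch below is illustrative: the base-URL path, model name, and helper function are assumptions, not taken from the project's code.

```javascript
// Hypothetical client-side sketch (not part of api/llm_api_proxy.js).
// The /v2/gemini path appears in the project docs; the model name and
// placeholder key are illustrative.
const PROXY_BASE = "https://llmapi.ultrasev.com/v2/gemini";

// Build an OpenAI-style chat-completions request targeting the proxy.
function buildChatRequest(model, userMessage) {
  return {
    url: `${PROXY_BASE}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // The upstream provider's API key is passed as a normal Bearer token.
        Authorization: "Bearer YOUR_PROVIDER_KEY",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

// Sending it is then an ordinary fetch:
//   const req = buildChatRequest("gemini-1.5-flash", "Hello");
//   const resp = await fetch(req.url, req.init);
```

Any OpenAI-compatible SDK works the same way by overriding its base URL to point at the deployed Worker.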
How It Works
The project leverages Cloudflare Workers to create a serverless API gateway. Users deploy the provided JavaScript code (api/llm_api_proxy.js) to their own Cloudflare Worker instance. This worker intercepts API requests and routes them to the appropriate LLM backend based on the request path (e.g., /v2/gemini). By adhering to the OpenAI API specification, it ensures compatibility with existing client libraries and applications. For certain providers like Groq, an additional request relay layer using Vercel and FastAPI is employed.
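The path-based routing described above can be sketched as follows. This is a minimal illustration of the idea, not the actual contents of api/llm_api_proxy.js: the backend table, the /v1/groq path, and the upstream hosts are assumptions (only /v2/gemini is mentioned in the docs).

```javascript
// Hypothetical routing table: provider path prefix -> upstream API host.
const BACKENDS = {
  "/v2/gemini": "https://generativelanguage.googleapis.com",
  "/v1/groq": "https://api.groq.com/openai",
};

// Map an incoming request URL to the upstream URL it should be proxied to,
// preserving the OpenAI-style sub-path (e.g. /chat/completions) and query
// string. Returns null when no provider prefix matches.
function resolveUpstream(requestUrl) {
  const url = new URL(requestUrl);
  const prefix = Object.keys(BACKENDS).find((p) => url.pathname.startsWith(p));
  if (!prefix) return null;
  return BACKENDS[prefix] + url.pathname.slice(prefix.length) + url.search;
}

// In a Cloudflare Worker, the fetch handler would forward the request:
//   export default {
//     async fetch(request) {
//       const upstream = resolveUpstream(request.url);
//       if (!upstream) return new Response("Unknown provider", { status: 404 });
//       return fetch(new Request(upstream, request));
//     },
//   };
```

Keeping the original request's method, headers, and body intact is what lets unmodified OpenAI client libraries talk to each backend through the single endpoint.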
Quick Start & Requirements
- Deploy the api/llm_api_proxy.js code as a new Cloudflare Worker.
- The default workers.dev domain may be restricted in some regions (e.g., China).
- A hosted instance is available at https://llmapi.ultrasev.com
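The deployment step can be captured in a minimal wrangler.toml for Cloudflare's wrangler CLI. The Worker name and compatibility date below are placeholders, not values from the repository:

```toml
# Hypothetical wrangler.toml; adjust name and date to your deployment.
name = "llm-api-proxy"
main = "api/llm_api_proxy.js"
compatibility_date = "2024-01-01"
```

With this file in place, `wrangler deploy` publishes the Worker; binding a custom domain in the Cloudflare dashboard avoids the workers.dev restrictions noted above.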
Maintenance & Community
Last commit about 1 year ago; the project is marked inactive.
Licensing & Compatibility
Limitations & Caveats
The workers.dev domain may face accessibility issues in certain geographic locations, necessitating the use of a custom domain.