1rgs: API proxy for using Anthropic clients with other backends
Top 19.0% on SourcePulse
This project provides a proxy server that lets users of Anthropic's API clients, such as Claude Code, leverage models from OpenAI and Google Gemini. It is designed for developers and researchers who want to keep familiar Anthropic client interfaces while switching to alternative LLM backends that may be more cost-effective or performant.
How It Works
The proxy intercepts requests formatted for the Anthropic API and translates them into the appropriate format for either OpenAI or Google Gemini using the LiteLLM library. It then forwards the request to the chosen backend, receives the response, and converts it back into the Anthropic API format before returning it to the client. This approach enables seamless integration with existing Anthropic-compatible tools.
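The snippet below is a minimal sketch of that translation step, not the project's actual code. It assumes an Anthropic-style /v1/messages payload arrives as a plain dict; the function name proxy_messages, the default values, and the response envelope are illustrative only.

```python
# A hedged sketch of translating an Anthropic-style request into a LiteLLM call
# and re-wrapping the reply in an Anthropic-style response.
import litellm

def proxy_messages(anthropic_request: dict, backend_model: str) -> dict:
    # Both APIs use a list of {"role", "content"} messages, so the main work is
    # carrying fields like the system prompt and max_tokens across.
    messages = []
    if "system" in anthropic_request:
        messages.append({"role": "system", "content": anthropic_request["system"]})
    messages.extend(anthropic_request["messages"])

    response = litellm.completion(
        model=backend_model,  # e.g. "openai/gpt-4o" or "gemini/gemini-1.5-pro"
        messages=messages,
        max_tokens=anthropic_request.get("max_tokens", 1024),
        temperature=anthropic_request.get("temperature", 1.0),
    )

    # Re-shape the backend reply into an Anthropic-style message envelope.
    return {
        "type": "message",
        "role": "assistant",
        "content": [{"type": "text", "text": response.choices[0].message.content}],
        "stop_reason": "end_turn",
    }
```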
Quick Start & Requirements
- Run the server with uv: uv run uvicorn server:app --host 0.0.0.0 --port 8082 --reload (requires uv installed).
- Configure API keys and the preferred provider (openai or google) via a .env file.
- Set ANTHROPIC_BASE_URL=http://localhost:8082 and use Anthropic clients as usual (see the sketch after this list).
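As an illustration of the last step, here is a hedged example of pointing the official anthropic Python client at the proxy. The model name and the assumption that the proxy does not validate the client-side API key are mine, not stated in the README.

```python
# Hypothetical client-side usage, assuming the proxy is running locally on port 8082.
import anthropic

client = anthropic.Anthropic(
    base_url="http://localhost:8082",  # point the SDK at the proxy instead of api.anthropic.com
    api_key="placeholder",             # assumption: the proxy handles backend auth itself
)

reply = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # an alias the proxy remaps to an OpenAI or Gemini model
    max_tokens=256,
    messages=[{"role": "user", "content": "Say hello from the proxied backend."}],
)
print(reply.content[0].text)
```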
Highlighted Details
- Maps Anthropic's haiku and sonnet model aliases to specified OpenAI or Gemini models (a rough sketch of this mapping follows the list).
- Supports provider prefixes (openai/ or gemini/) for selecting a backend per model.
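A rough idea of what that alias mapping could look like, under the assumption that the proxy rewrites model names before handing them to LiteLLM. The table contents and helper below are illustrative, not the project's configuration.

```python
# Hypothetical alias table; actual targets would come from the .env configuration.
MODEL_ALIASES = {
    "haiku": "openai/gpt-4o-mini",
    "sonnet": "gemini/gemini-1.5-pro",
}

def resolve_model(requested: str) -> str:
    # Map a requested Anthropic model name to a provider-prefixed LiteLLM model,
    # falling back to passing the name through unchanged if no alias matches.
    for alias, target in MODEL_ALIASES.items():
        if alias in requested:
            return target
    return requested
```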
Maintenance & Community
Contributions are welcome via Pull Requests. No specific community channels or contributor information is detailed in the README.
Licensing & Compatibility
The repository does not explicitly state a license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
The absence of a stated license may hinder commercial adoption. The README does not detail supported Gemini models beyond the defaults, and backend support appears limited to OpenAI and Gemini.
Last updated 2 months ago; the repository is marked inactive.