Proxy for Claude Code to OpenAI API
This project provides a proxy server that translates Claude API requests into OpenAI-compatible API calls. It enables users to leverage various LLM providers, including OpenAI, Azure OpenAI, and local Ollama models, through the Claude Code CLI. The primary benefit is unified access to different LLM backends using a familiar Claude-like interface.
How It Works
The proxy intercepts requests to its /v1/messages endpoint, which mimics the Claude API. It then maps the requested Claude model (e.g., Haiku, Sonnet, Opus) to a user-configured OpenAI model via the SMALL_MODEL, MIDDLE_MODEL, and BIG_MODEL environment variables. Each request is transformed into the OpenAI API format and forwarded to the configured OPENAI_BASE_URL. Responses, including streaming and function calling, are handled and returned in a Claude-compatible format.
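The mapping rule can be pictured as a small tier lookup keyed on the Claude model name. The following Python sketch is purely illustrative, assuming substring matching and a hypothetical fallback to MIDDLE_MODEL; the proxy's actual logic may differ.

```python
import os

def map_model(claude_model: str) -> str:
    """Illustrative tier lookup, not the proxy's actual implementation."""
    name = claude_model.lower()
    if "haiku" in name:        # small tier
        return os.environ["SMALL_MODEL"]
    if "sonnet" in name:       # middle tier
        return os.environ["MIDDLE_MODEL"]
    if "opus" in name:         # big tier
        return os.environ["BIG_MODEL"]
    return os.environ["MIDDLE_MODEL"]  # assumed fallback for unknown names
```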
Quick Start & Requirements
1. Install dependencies with uv sync or pip install -r requirements.txt (uv is recommended).
2. Copy .env.example to .env and set OPENAI_API_KEY and, optionally, OPENAI_BASE_URL.
3. Start the proxy with python start_proxy.py or uv run claude-code-proxy.
4. Set ANTHROPIC_BASE_URL=http://localhost:8082 and use the claude CLI. Any other Claude Messages client can be pointed at the proxy the same way, as in the sketch below.
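To verify the proxy end to end without the claude CLI, a plain Claude-style request works too. A minimal sketch, assuming the default address above and an illustrative Haiku-class model name (no auth header is shown; your configuration may require one):

```python
import json
import urllib.request

payload = {
    "model": "claude-3-5-haiku-20241022",  # illustrative; routed to SMALL_MODEL
    "max_tokens": 128,
    "messages": [{"role": "user", "content": "Reply with one short sentence."}],
}

req = urllib.request.Request(
    "http://localhost:8082/v1/messages",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())
    # Claude-style responses carry a list of content blocks.
    print(body["content"][0]["text"])
```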
Highlighted Details
- Claude-compatible /v1/messages endpoint.
- Model selection via the SMALL_MODEL, MIDDLE_MODEL, and BIG_MODEL environment variables.
- Streaming and function-calling responses (see the streaming sketch below).
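Streaming follows the Claude server-sent-events shape, where incremental text arrives in content_block_delta events. A rough consumer sketch, under the same local-address and model-name assumptions as above:

```python
import json
import urllib.request

payload = {
    "model": "claude-3-5-haiku-20241022",  # illustrative model name
    "max_tokens": 128,
    "stream": True,
    "messages": [{"role": "user", "content": "Count to five."}],
}

req = urllib.request.Request(
    "http://localhost:8082/v1/messages",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    for raw in resp:  # iterate the SSE stream line by line
        line = raw.decode("utf-8").strip()
        if not line.startswith("data:"):
            continue  # skip event-name and blank lines
        event = json.loads(line[len("data:"):])
        if event.get("type") == "content_block_delta":
            print(event["delta"].get("text", ""), end="", flush=True)
```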
Maintenance & Community
The project is maintained by fuergaosi233. No specific community channels or roadmap links are provided in the README.
Licensing & Compatibility
Limitations & Caveats
Configuration is driven entirely by environment variables, which may require careful management in production deployments. The proxy accepts any Claude model tier, but actual performance and capabilities depend entirely on the backend LLM provider configured behind it.