claude-code-proxy by fuergaosi233

Proxy for Claude Code to OpenAI API

created 1 month ago
912 stars

Top 40.7% on sourcepulse

Project Summary

This project provides a proxy server that translates Claude API requests into OpenAI-compatible API calls. It enables users to leverage various LLM providers, including OpenAI, Azure OpenAI, and local Ollama models, through the Claude Code CLI. The primary benefit is unified access to different LLM backends using a familiar Claude-like interface.

How It Works

The proxy intercepts requests to its /v1/messages endpoint, which mimics the Claude API. It then maps the requested Claude model tier (e.g., Haiku, Sonnet, Opus) to a user-configured OpenAI model via environment variables (SMALL_MODEL, MIDDLE_MODEL, BIG_MODEL). The request is transformed into the OpenAI API format and forwarded to the configured OPENAI_BASE_URL. Responses, including streaming and function calls, are translated back into a Claude-compatible format.
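
Conceptually, the mapping and translation step might look like the following Python sketch. This is an illustration of the idea, not the project's actual code: the fallback model names and the to_openai_request helper are hypothetical, and the real proxy also handles tools, images, and streaming.

    import os

    # Illustrative tier-to-model mapping driven by the documented
    # environment variables (the fallback model names are made up).
    MODEL_MAP = {
        "haiku": os.environ.get("SMALL_MODEL", "gpt-4o-mini"),
        "sonnet": os.environ.get("MIDDLE_MODEL", "gpt-4o"),
        "opus": os.environ.get("BIG_MODEL", "gpt-4o"),
    }

    def map_model(claude_model: str) -> str:
        """Resolve a Claude model name to a configured OpenAI model."""
        for tier, openai_model in MODEL_MAP.items():
            if tier in claude_model.lower():
                return openai_model
        return MODEL_MAP["sonnet"]  # arbitrary fallback for unknown names

    def to_openai_request(claude_req: dict) -> dict:
        """Translate a Claude /v1/messages body into an OpenAI
        /v1/chat/completions body (simplified: text-only messages)."""
        messages = []
        if "system" in claude_req:
            messages.append({"role": "system", "content": claude_req["system"]})
        messages.extend(claude_req["messages"])
        return {
            "model": map_model(claude_req["model"]),
            "messages": messages,
            "max_tokens": claude_req.get("max_tokens", 1024),
            "stream": claude_req.get("stream", False),
        }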

Quick Start & Requirements

  • Install: uv sync or pip install -r requirements.txt
  • Configure: Copy .env.example to .env and set OPENAI_API_KEY and optionally OPENAI_BASE_URL.
  • Run: python start_proxy.py or uv run claude-code-proxy
  • Usage: Set ANTHROPIC_BASE_URL=http://localhost:8082 and use the claude CLI (a Python client sketch follows this list).
  • Dependencies: Python 3.x, uv (recommended).
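
Once the proxy is running, any Claude-compatible client can be pointed at it. A minimal sketch using the official anthropic Python SDK (the port matches the ANTHROPIC_BASE_URL above; the placeholder API key assumes the proxy authenticates to the backend with its own OPENAI_API_KEY, which may differ from the actual behavior):

    import anthropic

    # Point the standard Anthropic client at the local proxy instead of
    # api.anthropic.com.
    client = anthropic.Anthropic(
        base_url="http://localhost:8082",
        api_key="placeholder",  # backend auth comes from the proxy's .env
    )

    message = client.messages.create(
        model="claude-3-5-haiku-20241022",  # Haiku tier, mapped to SMALL_MODEL
        max_tokens=256,
        messages=[{"role": "user", "content": "Say hello through the proxy."}],
    )
    print(message.content[0].text)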

Highlighted Details

  • Full Claude API /v1/messages endpoint support.
  • Supports OpenAI, Azure OpenAI, and Ollama (or any OpenAI-compatible API).
  • Smart model mapping for Claude Haiku, Sonnet, and Opus to configurable OpenAI models.
  • Complete function calling and SSE streaming response support.
  • Image input via Base64 encoding is supported (sketched after this list).
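
For reference, image input uses the standard Claude messages schema, with the image bytes embedded as Base64 in a content block. A sketch of such a request body (the file name is hypothetical; whether images are accepted ultimately depends on the configured backend model):

    import base64

    # Encode a local image and embed it in a Claude-format message.
    with open("screenshot.png", "rb") as f:  # hypothetical local file
        image_b64 = base64.standard_b64encode(f.read()).decode("utf-8")

    request_body = {
        "model": "claude-3-5-sonnet-20241022",
        "max_tokens": 512,
        "messages": [
            {
                "role": "user",
                "content": [
                    {
                        "type": "image",
                        "source": {
                            "type": "base64",
                            "media_type": "image/png",
                            "data": image_b64,
                        },
                    },
                    {"type": "text", "text": "Describe this image."},
                ],
            }
        ],
    }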

Maintenance & Community

The project is maintained by fuergaosi233. No specific community channels or roadmap links are provided in the README.

Licensing & Compatibility

  • License: MIT License.
  • Compatibility: Permissive MIT license allows for commercial use and integration with closed-source applications.

Limitations & Caveats

The proxy relies entirely on environment variables for configuration, which requires careful secret management in production environments. And while it supports various backends, actual performance and capabilities depend entirely on the LLM provider you configure.
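
One common way to tame environment-variable configuration is to validate it once at startup and fail fast. A hypothetical sketch, not the proxy's actual behavior (the default base URL and port below are assumptions):

    import os
    import sys

    REQUIRED = ["OPENAI_API_KEY"]
    OPTIONAL_DEFAULTS = {
        "OPENAI_BASE_URL": "https://api.openai.com/v1",  # assumed default
        "PORT": "8082",
    }

    def load_config() -> dict:
        """Fail fast on missing required settings instead of erroring
        on the first proxied request."""
        missing = [name for name in REQUIRED if not os.environ.get(name)]
        if missing:
            sys.exit("Missing required environment variables: " + ", ".join(missing))
        config = {name: os.environ[name] for name in REQUIRED}
        for name, default in OPTIONAL_DEFAULTS.items():
            config[name] = os.environ.get(name, default)
        return config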

Health Check

  • Last commit: 2 weeks ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 11
  • Issues (30d): 10

Star History

  • 953 stars in the last 90 days
