icebear0828: Local proxy for AI programming models, compatible with OpenAI APIs
Top 59.3% on SourcePulse
Summary
Codex Proxy is a lightweight local service that translates Codex Desktop's Responses API into the standard OpenAI, Anthropic, and Gemini protocols. It enables seamless integration with existing AI clients, acting as a gateway for a personal AI programming assistant.
How It Works
This project acts as an intermediary, converting client requests (OpenAI, Anthropic, Gemini) to the Codex Desktop API format and back, with support for SSE streaming. It employs anti-detection techniques, including Chrome TLS fingerprinting via curl-impersonate, request header spoofing, and cookie persistence, to mimic real Codex Desktop behavior. Key features include account management, multi-account rotation, plan routing, and automatic token renewal.
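The core translation step can be sketched as follows. This is a minimal illustration, not the proxy's actual code: the Responses-side field names (`input`, `instructions`) and the system-message handling are assumptions about what such a mapping typically looks like.

```python
# Sketch: map an OpenAI-style /v1/chat/completions body onto a
# Responses-API-style payload. Field names on the Codex side are
# illustrative assumptions, not the proxy's actual schema.

def openai_to_responses(body: dict) -> dict:
    messages = body.get("messages", [])
    # System messages commonly become top-level instructions;
    # the remaining messages become input items.
    instructions = "\n".join(
        m["content"] for m in messages if m["role"] == "system"
    )
    input_items = [
        {"role": m["role"], "content": m["content"]}
        for m in messages
        if m["role"] != "system"
    ]
    payload = {
        "model": body.get("model"),
        "input": input_items,
        "stream": True,  # the Codex API is stream-only (see Limitations)
    }
    if instructions:
        payload["instructions"] = instructions
    return payload


req = {
    "model": "example-model",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Write a hello-world in Go."},
    ],
}
print(openai_to_responses(req))
```

The reverse direction (Responses events back to OpenAI/Anthropic/Gemini shapes) follows the same idea, with an extra step to re-frame streaming events for each protocol.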
Quick Start & Requirements
- Requires curl; Docker is optional.
- The service runs at http://localhost:8080.
- A CHANGELOG.md is available.
Highlighted Details
- Supports OpenAI (/v1/chat/completions), Anthropic (/v1/messages), Gemini, and direct Codex (/v1/responses) APIs with SSE streaming.
- Uses curl-impersonate for TLS fingerprinting, spoofs desktop headers, persists cookies, and auto-updates fingerprint versions.
- Translates function_call / tool_calls across all protocols.
Maintenance & Community
No specific maintainer or community details beyond a mention of a WeChat group for assistance. Updates are tracked in CHANGELOG.md.
Licensing & Compatibility
Limitations & Caveats
curl-impersonate is unavailable on Windows; Docker is recommended there. The Codex API is stream-only, so stream: false requests collect the full stream before returning JSON. The project depends on Codex Desktop's public interfaces, so upstream updates may require proxy adjustments.
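The stream-collection behavior for stream: false can be sketched like this. The SSE line format and the `delta` field name are illustrative assumptions, not the Codex API's actual wire format.

```python
import json

# Sketch: how a stream-only upstream can serve `stream: false` clients.
# Consume every SSE data line, concatenate the text deltas, and return
# one complete JSON-ready body. Event/field names are illustrative.

def collect_stream(sse_lines):
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip comments, blank keep-alives, etc.
        data = line[len("data: "):]
        if data == "[DONE]":
            break
        event = json.loads(data)
        parts.append(event.get("delta", ""))
    return {"content": "".join(parts)}


stream = [
    'data: {"delta": "Hello"}',
    'data: {"delta": ", world"}',
    "data: [DONE]",
]
print(collect_stream(stream))  # {'content': 'Hello, world'}
```

This is why stream: false adds latency: the full upstream stream must finish before the aggregated JSON can be returned.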