HTTP proxy for OpenAI API
This Go-based HTTP proxy provides access to the OpenAI API for users facing regional restrictions or wanting a single place to manage their API calls. It can be self-hosted on cloud functions (Tencent Cloud, Alibaba Cloud, AWS Lambda) or on traditional servers, with a focus on ease of deployment and configuration.
How It Works
The proxy acts as an intermediary, forwarding incoming requests to a configurable upstream endpoint. It defaults to api.openai.com, but can be pointed at Azure OpenAI endpoints or any other reachable domain via the X-Target-Host header, so traffic can be routed through different providers or custom endpoints as needed.
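A minimal sketch of this flow, assuming a plain net/http reverse proxy (the handler structure, listen port, and header stripping are illustrative assumptions, not the project's actual code):

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
)

const defaultTarget = "api.openai.com" // fallback when no X-Target-Host is sent

func main() {
	proxy := &httputil.ReverseProxy{
		Director: func(req *http.Request) {
			// Pick the upstream host per request, defaulting to the official API.
			target := req.Header.Get("X-Target-Host")
			if target == "" {
				target = defaultTarget
			}
			req.URL.Scheme = "https"
			req.URL.Host = target
			req.Host = target
			req.Header.Del("X-Target-Host") // keep the routing hint from reaching the upstream API
		},
	}

	// Listen address is an assumption; the project documents -domain and -port flags for configuration.
	log.Fatal(http.ListenAndServe(":9000", proxy))
}
```

In this sketch, stripping X-Target-Host before forwarding is a deliberate choice so the routing header never leaks to the upstream provider.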
Quick Start & Requirements
Build with ./build.sh (requires the Go toolchain). Use the X-Target-Host header or the command-line arguments (-domain, -port) to specify target domains and ports.
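For illustration of the header-based routing (not taken from the project's docs; the local address, request path, payload, and target host are placeholders), a client points its requests at the proxy and can name a different upstream per request:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
)

func main() {
	payload := strings.NewReader(`{"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"hello"}]}`)

	// Assumes a proxy instance listening locally on :9000.
	req, err := http.NewRequest("POST", "http://localhost:9000/v1/chat/completions", payload)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer sk-...") // your real API key
	// Route this request to a different upstream than api.openai.com.
	req.Header.Set("X-Target-Host", "my-gateway.example.com")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(body))
}
```

Omitting the header falls back to api.openai.com, per the default described above.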
Highlighted Details
- Per-request routing to arbitrary upstream domains via the X-Target-Host header.
- Public proxy endpoint: proxy.geekai.co.
Maintenance & Community
The project is maintained by geekai-dev. Further community or roadmap information is not detailed in the README.
Licensing & Compatibility
The README does not explicitly state a license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
Streaming responses are limited on standard cloud-function deployments, which typically do not support chunked transfer; a dedicated server environment is required for full streaming support. The project's license is not specified, which may hinder commercial adoption.
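A rough sketch (assumed, not the project's code) of why the runtime matters: relaying a streamed body chunk by chunk requires the ResponseWriter to implement http.Flusher, and typical cloud-function adapters buffer the whole response instead, so chunks only reach the client after the upstream has finished.

```go
package proxy

import (
	"io"
	"net/http"
)

// streamCopy is an illustrative helper, not taken from the project: it relays
// an upstream (e.g. SSE) body chunk by chunk, flushing whenever the runtime allows.
func streamCopy(w http.ResponseWriter, upstream io.Reader) error {
	flusher, canFlush := w.(http.Flusher) // cloud-function writers often lack this
	buf := make([]byte, 4096)
	for {
		n, err := upstream.Read(buf)
		if n > 0 {
			if _, werr := w.Write(buf[:n]); werr != nil {
				return werr
			}
			if canFlush {
				flusher.Flush() // push the chunk to the client immediately
			}
		}
		if err == io.EOF {
			return nil
		}
		if err != nil {
			return err
		}
	}
}
```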