Simple proxy for OpenAI API via Docker
This project provides a simple proxy for the OpenAI API that can be deployed with a single Docker command. It is designed for developers who need to route their GPT requests through a custom endpoint, and it offers SSE streaming support plus optional Tencent Cloud text moderation, making it suitable for users in regions with restricted OpenAI access or those requiring content filtering.
How It Works
The proxy acts as an intermediary, forwarding requests to the OpenAI API and returning the responses. It is built with Node.js and can be deployed to any environment supporting Node.js 14+ or run from a Docker image. Key features are Server-Sent Events (SSE) streaming, which allows real-time content delivery, and an optional integration with Tencent Cloud's text moderation service for content filtering.
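As a rough illustration, a client sends its usual OpenAI-style request to the proxy's address instead of api.openai.com. The sketch below assumes the proxy is reachable at http://localhost:9000 and forwards an OpenAI-compatible /v1/chat/completions path; the exact path and header handling are assumptions, not details confirmed by the README (Node.js 18+, run as an ES module).

```js
// Hypothetical client: stream a chat completion through the proxy instead of api.openai.com.
// The proxy URL, path, and auth header below are assumptions; adjust to match your deployment.
const response = await fetch("http://localhost:9000/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    stream: true, // ask for SSE streaming
    messages: [{ role: "user", content: "Hello" }],
  }),
});

// Print the raw SSE events as they arrive from the proxy.
const decoder = new TextDecoder();
for await (const chunk of response.body) {
  process.stdout.write(decoder.decode(chunk, { stream: true }));
}
```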
Quick Start & Requirements
- Docker (one line): `docker run -p 9000:9000 easychen/ai.level06.com:latest`
- Node.js (14+): download `app.js` and `package.json`, run `yarn install`, then `node app.js`.
- Configuration via environment variables: `PORT`, `PROXY_KEY` (for access control), `TIMEOUT`, `TENCENT_CLOUD_SID`, `TENCENT_CLOUD_SKEY`, and `TENCENT_CLOUD_AP` (for moderation); an example invocation with these variables is shown below.
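For a containerized deployment, these variables can be passed with standard Docker `-e` flags. The values below are placeholders, and the assumption that `TIMEOUT` is given in milliseconds is not confirmed by the README:

```sh
# Placeholder values; replace with your own keys. TIMEOUT in milliseconds is an assumption.
docker run -p 9000:9000 \
  -e PROXY_KEY=your-access-key \
  -e TIMEOUT=30000 \
  -e TENCENT_CLOUD_SID=your-secret-id \
  -e TENCENT_CLOUD_SKEY=your-secret-key \
  easychen/ai.level06.com:latest
```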
Maintenance & Community
The project appears to be maintained by a single developer, easychen, and shows little recent activity: the last update was roughly a year ago and the project appears inactive. There are no explicit links to community channels or a roadmap in the README.
Licensing & Compatibility
The README does not explicitly state a license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
The proxy supports only GET and POST methods and excludes file-related interfaces. The README notes that SSE support was added recently, so that feature may still have early-stage issues or be under active development.