OpenAI API proxy for network access issues
This project provides a free OpenAI API proxy using Cloudflare Workers, designed to circumvent network access issues and support streaming output. It's targeted at developers and users needing reliable access to OpenAI's models, offering a self-hostable solution to bypass geographical restrictions or network blocks.
How It Works
The core mechanism is a Cloudflare Worker that acts as an intermediary: it routes requests to api.openai.com through a Cloudflare edge location, leveraging the wide availability and often unblocked nature of Cloudflare's network. Users then point their applications at the worker's URL, effectively proxying their OpenAI API calls.
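The repository's cf_worker.js holds the actual implementation; the sketch below only illustrates the general pass-through pattern such a worker follows (rewrite the host, forward everything else) and is not the project's exact code.

```javascript
// Illustrative sketch only -- the project's real logic lives in cf_worker.js.
// The idea: rewrite the incoming request's host to api.openai.com and forward
// the rest (path, headers, body) unchanged.
const OPENAI_HOST = "api.openai.com";

export default {
  async fetch(request) {
    const url = new URL(request.url);
    url.host = OPENAI_HOST;
    url.protocol = "https:";

    // Re-issue the request against the OpenAI origin, preserving method,
    // headers (including Authorization) and body. Returning the upstream
    // Response directly lets streamed (SSE) bodies pass through.
    const upstream = new Request(url.toString(), request);
    return fetch(upstream);
  },
};
```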
Quick Start & Requirements
Cloudflare Worker: copy the cf_worker.js code into a new Cloudflare Worker and bind a domain.
Docker: docker run -itd --name openaiproxy -p 3000:3000 --restart=always gindex/openaiproxy:latest
API endpoint (Docker): http://vpsip:3000/proxy/v1/chat/completions
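With either deployment, clients call the proxy exactly as they would the official API, only with the base URL swapped. A minimal non-streaming sketch against the Docker endpoint (assuming Node.js 18+ with built-in fetch; the vpsip host, model name, and OPENAI_API_KEY environment variable are placeholders, not prescribed by the project):

```javascript
// Standard Chat Completions request, pointed at the proxy instead of api.openai.com.
const response = await fetch("http://vpsip:3000/proxy/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello!" }],
  }),
});

console.log(await response.json());
```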
Highlighted Details
Usage examples are provided for curl, JavaScript fetch, Python requests, and the chatgpt-api Node.js library.
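Since the project advertises streaming output, a streaming call through the Worker might look like the sketch below; the worker domain, path, model name, and SSE handling are assumptions for illustration, not code taken from the repository.

```javascript
// Streaming sketch: replace the placeholder domain with whatever you bound to
// your own Worker. With stream: true the proxy relays OpenAI's Server-Sent
// Events, which can be read chunk by chunk (Node.js 18+).
const response = await fetch("https://your-worker.example.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    stream: true,
    messages: [{ role: "user", content: "Stream a short poem." }],
  }),
});

// Each chunk contains "data: {...}" SSE lines, ending with "data: [DONE]".
const decoder = new TextDecoder();
for await (const chunk of response.body) {
  process.stdout.write(decoder.decode(chunk, { stream: true }));
}
```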
Maintenance & Community
No specific contributors, sponsorships, or community links (Discord/Slack) are mentioned in the README.
Licensing & Compatibility
The repository does not explicitly state a license.
Limitations & Caveats
Some IP addresses may encounter access issues, as indicated by a screenshot in the README. The Docker deployment method is noted as potentially not supporting Server-Sent Events (SSE). The provided demo sites may experience high load.