# API proxy for OpenAI-compatible ChatGPT access
This project provides a proxy server that exposes the reverse-engineered `poe-api` Python library as an HTTP API, mimicking the OpenAI API. It allows users to connect existing OpenAI-API-dependent applications to Poe.com's free ChatGPT services, effectively enabling free access to advanced language models.
## How It Works
The project uses a Python backend that wraps the `poe-api` library to interact with Poe.com. A Go backend then exposes this functionality via an HTTP API designed to mirror the endpoints and parameters of the official OpenAI chat-completions API. This allows seamless integration with applications that expect the standard OpenAI interface.
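Because the proxy mirrors the OpenAI request shape, a client only has to change the base URL it talks to. The sketch below builds the kind of JSON payload such a client would POST; the helper function is illustrative, not part of this project, while the endpoint path and port come from this README:

```python
import json

def build_chat_request(model, messages, stream=False):
    """Build an OpenAI-style chat-completions payload.

    The proxy forwards the `model`, `messages`, and `stream`
    fields, mapping `model` to a Poe bot nickname server-side.
    """
    return {"model": model, "messages": messages, "stream": stream}

# A client would POST this JSON to /v1/chat/completions on the
# proxy's address (http://localhost:3700 per this README).
payload = build_chat_request(
    "gpt-3.5-turbo",
    [{"role": "user", "content": "Hello"}],
)
print(json.dumps(payload))
```

Existing OpenAI SDK clients can typically be pointed at the proxy just by overriding their base URL setting, with no change to the request body.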
## Quick Start & Requirements
1. Install the Python dependencies: `pip install -r external/requirements.txt`
2. Copy the example config with `cp config.example.toml config.toml`, then edit `config.toml`.
3. Start the Python backend: `python external/api.py`
4. Build and run the Go server: `go build && ./poe-openai-proxy`

Alternatively, run `docker-compose up -d` after creating `config.toml`.
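This README does not show what `config.toml` contains. As a hedged sketch only: a proxy like this typically needs at least a listen port and a Poe session token. The key names below are hypothetical placeholders, not the project's actual schema; consult `config.example.toml` for the real keys:

```toml
# Hypothetical placeholder keys — see config.example.toml for the
# actual schema shipped with the project.
port = 3700                  # matches the default address in this README
tokens = ["your-poe-token"]  # Poe session token(s); key name is an assumption
```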
## Highlighted Details
- Exposes the `/models`, `/chat/completions`, `/v1/models`, and `/v1/chat/completions` endpoints.
- Supports the `model`, `messages`, and `stream` request parameters, mapping Poe bot nicknames to model names.
- The server listens at `http://localhost:3700`.
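The model-to-bot mapping mentioned above can be pictured as a small lookup table. The sketch below is illustrative only: the actual mapping lives in the proxy's configuration, and the nicknames shown are stand-in examples, not confirmed Poe identifiers:

```python
# Illustrative only: real Poe bot nicknames are configured in
# config.toml; "chinchilla" and "beaver" are stand-in examples.
MODEL_TO_BOT = {
    "gpt-3.5-turbo": "chinchilla",
    "gpt-4": "beaver",
}

def resolve_bot(model_name: str) -> str:
    """Map an OpenAI model name to a Poe bot nickname,
    falling back to the name itself when no mapping exists."""
    return MODEL_TO_BOT.get(model_name, model_name)

print(resolve_bot("gpt-3.5-turbo"))  # -> "chinchilla" in this sketch
```

Falling back to the requested name keeps unmapped models usable as direct Poe bot nicknames.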
## Maintenance & Community
The project is a wrapper around the `poe-api` library found at https://github.com/ading2210/poe-api. No specific community channels or active maintenance signals are present in the README.
## Licensing & Compatibility
The README does not specify a license. The project depends on `poe-api`, whose license is also not specified in this README. Compatibility for commercial use or closed-source linking is not addressed.
## Limitations & Caveats
This project relies on a reverse-engineered API for Poe.com, which is subject to change or discontinuation by Quora without notice. The "free" access is dependent on Poe.com's terms of service and availability. The `name` parameter is not supported in `messages`.