API conversion layer for Coze
This project enables users to leverage Coze's AI capabilities, including its LLMs, knowledge bases, plugins, and workflows, through any OpenAI-compatible client. It acts as a proxy that accepts requests in the OpenAI API format and translates them into Coze API calls, which benefits users who prefer to keep working with existing OpenAI client ecosystems.
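For illustration, here is a minimal sketch of calling the proxy with the official openai Node package. The base URL and port, the use of a Coze token as the API key, and the bot-ID-as-model convention are assumptions for the sketch, not documented behavior.

```typescript
// Minimal sketch: an OpenAI-compatible client pointed at the proxy.
// Assumptions (not confirmed by the project): the proxy listens on
// http://localhost:3000/v1, expects a Coze token as the API key, and
// accepts a Coze bot ID in the "model" field.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:3000/v1", // point the client at the proxy instead of api.openai.com
  apiKey: process.env.COZE_API_KEY ?? "", // hypothetical: your Coze credential
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "your-coze-bot-id", // hypothetical bot-ID-as-model mapping
    messages: [{ role: "user", content: "Hello from an OpenAI-compatible client" }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```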
How It Works
The project functions as a reverse proxy: it intercepts requests intended for the OpenAI API and reformats them for the Coze API. It supports both streaming and blocking responses, giving users flexibility in how they interact with the Coze models. The core advantage is unified access to Coze's advanced features from familiar OpenAI client interfaces, which simplifies integration.
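As a sketch of the streaming path, under the same assumed base URL and bot-ID-as-model convention as above, setting stream: true on an OpenAI client would receive the chunks the proxy forwards from Coze; omitting it falls back to a single blocking response.

```typescript
// Streaming sketch under the same assumptions as the previous example.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:3000/v1", // assumed local proxy address
  apiKey: process.env.COZE_API_KEY ?? "",
});

async function streamReply() {
  const stream = await client.chat.completions.create({
    model: "your-coze-bot-id", // hypothetical bot-ID-as-model mapping
    messages: [{ role: "user", content: "Stream a short answer" }],
    stream: true, // streamed chunks; leave unset for a blocking response
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

streamReply().catch(console.error);
```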
Quick Start & Requirements
Deploy with docker-compose up -d (after cloning and configuring .env), or run pnpm install && pnpm start for a local deployment. Local setup involves configuring .env, installing dependencies, and starting the server. Docker deployment requires Docker and Docker Compose.
Highlighted Details
Bot configuration is supplied through the BOT_CONFIG environment variable.
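A hypothetical .env sketch follows to show where these settings live; apart from BOT_CONFIG itself, the variable names and the value format are assumptions and should be taken from the project's own documentation.

```text
# Hypothetical sketch only; consult the project's documentation for the real keys.
COZE_API_KEY=your-coze-token                            # assumed name for the Coze credential
BOT_CONFIG=[{"model":"my-bot","bot_id":"1234567890"}]   # assumed JSON mapping; format not confirmed
```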
Maintenance & Community
The project is actively maintained, with a roadmap indicating future support for image and audio features. Contact details for X and Telegram are provided for feedback and questions.
Licensing & Compatibility
Licensed under the MIT License, permitting commercial use and integration with closed-source applications.
Limitations & Caveats
Vercel deployment has a 10-second timeout limit for serverless functions, which may impact longer-running Coze operations. Image and audio support are listed as "Coming Soon."