API proxy for Pandora-Next
This project provides a compatibility layer to expose Pandora-Next's backend API as a standard OpenAI v1 API, specifically targeting /v1/chat/completions and /v1/images/generations. It's designed for users who want to integrate Pandora-Next with other applications that expect OpenAI's API format, offering features like streaming, DALL-E image generation, and support for various GPT models and GPTS.
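A client can therefore call the proxy the same way it would call OpenAI. Below is a minimal sketch of a non-streaming chat request; the host, port, and the key sent in the Authorization header are assumptions, so substitute the values from your own deployment.

```python
# Minimal sketch: non-streaming chat completion against the proxy.
# BASE_URL and API_KEY are assumptions; replace them with your deployment's values.
import requests

BASE_URL = "http://localhost:8080"           # assumed proxy address
API_KEY = "<your-pandora-next-or-gpts-key>"  # placeholder credential

resp = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": False,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```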
How It Works
The core functionality involves proxying requests from the standard OpenAI v1 endpoints to Pandora-Next's internal /backend-api endpoints. It handles both streaming and non-streaming responses for chat completions and supports DALL-E image generation requests. The project also includes API key management for GPTS, custom endpoint prefixes, and a "Bot Mode" for chat bot integrations that allows customized formatting of tool usage and results in the output.
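To illustrate the streaming path, the sketch below consumes a streamed chat completion, assuming the proxy emits OpenAI-style server-sent events ("data: {...}" chunks terminated by a [DONE] marker); the base URL, key, and model name are placeholders.

```python
# Sketch: consume a streamed chat completion, assuming OpenAI-style SSE chunks.
# BASE_URL, API_KEY, and the model name are placeholders for your deployment.
import json
import requests

BASE_URL = "http://localhost:8080"
API_KEY = "<your-key>"

with requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4-s",
        "messages": [{"role": "user", "content": "Stream a short poem."}],
        "stream": True,
    },
    stream=True,
    timeout=300,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if not line or not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":
            break
        # Each chunk carries an incremental "delta" with the next piece of text.
        delta = json.loads(payload)["choices"][0]["delta"]
        print(delta.get("content", ""), end="", flush=True)
    print()
```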
Quick Start & Requirements
- Deployment uses the provided docker-compose.yml.
- Requires a Pandora-Next instance with auto_conv_arkose:true enabled.
- Configure docker-compose.yml for the Pandora-Next URL, API prefix, logging, worker/thread counts, and model aliases (a quick client-side check is sketched below).
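One way to verify a deployment is to point an OpenAI-compatible client at the proxy. The sketch below uses the official openai Python package to exercise the image endpoint; the host, port, the /my-prefix path segment (standing in for whatever API prefix you configured), and the dall-e-3 model name are all assumptions.

```python
# Sketch: call the proxy's /v1/images/generations through the openai client.
# The base_url (including the assumed "/my-prefix" API prefix), api_key, and
# model name are placeholders; adjust them to match your docker-compose.yml.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/my-prefix/v1",
    api_key="<your-key>",
)

image = client.images.generate(
    model="dall-e-3",
    prompt="A lighthouse at dusk",
    n=1,
    size="1024x1024",
)
print(image.data[0].url)
```

If no custom API prefix is configured, the /my-prefix segment would simply be dropped from the base URL.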
Highlighted Details
- Supports gpt-4-s, gpt-4-mobile, gpt-3.5-turbo, and GPTS.
Maintenance & Community
The project acknowledges contributions from various individuals. A Telegram channel is available for community discussion and support.
Licensing & Compatibility
The project is released under the MIT License, allowing for commercial use and integration with closed-source applications.
Limitations & Caveats
The project explicitly states it does not bypass OpenAI or PandoraNext official limitations and is not designed for high-concurrency scenarios due to its proxy nature. It also warns that using generated Arkose tokens may lead to account bans, with all consequences borne by the user.
Last updated 1 year ago; status: inactive.