Azure OpenAI proxy for OpenAI API conversion
This project provides a proxy service that bridges the OpenAI API and Azure OpenAI Service, enabling seamless integration of Azure's offerings into the existing OpenAI ecosystem. It is designed for developers and users who want to use Azure OpenAI models (including GPT-4 and Embeddings) with tools and applications built for the standard OpenAI API, such as LangChain and various ChatGPT frontends, at no additional cost.
How It Works
The proxy intercepts requests intended for the OpenAI API and transforms them into the format required by Azure OpenAI Service. It handles model name mapping, endpoint conversion, and API key authentication, abstracting away the differences between the two services. This approach allows applications to communicate with Azure OpenAI as if they were interacting with the standard OpenAI API, simplifying migration and adoption.
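As a rough sketch of this flow (the port, key, resource, and deployment names below are placeholders, not values defined by this project), a client sends a standard OpenAI-style request to the proxy, and the proxy forwards it to the matching Azure deployment:

# Standard OpenAI-shaped request, pointed at the proxy instead of api.openai.com.
# The Azure API key is supplied where an OpenAI key would normally go.
curl http://localhost:8080/v1/chat/completions \
  -H "Authorization: Bearer your_azure_api_key" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'

# Conceptually, the proxy rewrites this into an Azure OpenAI request of the form
# POST https://your-resource.openai.azure.com/openai/deployments/your-deployment/chat/completions?api-version=<AZURE_OPENAI_API_VER>
# moving the key into the api-key header that Azure expects.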
Quick Start & Requirements
docker run -d -p 8080:8080 --name=azure-openai-proxy \
  --env AZURE_OPENAI_ENDPOINT=your_azure_endpoint \
  --env AZURE_OPENAI_API_VER=your_azure_api_ver \
  --env AZURE_OPENAI_MODEL_MAPPER=your_azure_deploy_mapper \
  stulzq/azure-openai-proxy:latest
Configuration is supplied either through environment variables (AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_VER, AZURE_OPENAI_MODEL_MAPPER) or through a config.yaml file; an illustrative run with placeholder values is sketched below.
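For illustration only, a filled-in run might look like the following; the endpoint, API version, and model-to-deployment pairs are placeholders, and the comma-separated mapper syntax shown should be checked against the project documentation:

# Placeholder values; replace with your own Azure resource details.
docker run -d -p 8080:8080 --name=azure-openai-proxy \
  --env AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/ \
  --env AZURE_OPENAI_API_VER=2023-05-15 \
  --env AZURE_OPENAI_MODEL_MAPPER=gpt-3.5-turbo=your-gpt35-deployment,gpt-4=your-gpt4-deployment \
  stulzq/azure-openai-proxy:latest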
Highlighted Details
Compatible ChatGPT frontends include chatgpt-web, chatbox, and ChatGPT-Next-Web (see the integration sketch below). Model-to-deployment mappings can also be maintained in a config.yaml file.
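As a sketch of how a frontend can be pointed at the proxy (the image name and environment variables below are assumptions about ChatGPT-Next-Web, not part of this project), the frontend's API base URL is simply redirected to the proxy:

# Assumes ChatGPT-Next-Web's Docker image honors OPENAI_API_KEY and BASE_URL,
# and that the frontend can reach the proxy container at the given address.
docker run -d -p 3000:3000 \
  --env OPENAI_API_KEY=your_azure_api_key \
  --env BASE_URL=http://azure-openai-proxy:8080 \
  yidadaa/chatgpt-next-web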
Maintenance & Community
The project is maintained by a single primary developer (stulzq).
Licensing & Compatibility
Limitations & Caveats