API proxy for Azure OpenAI that converts OpenAI-style requests into Azure OpenAI requests
This project provides a proxy for Azure OpenAI API, enabling seamless integration with open-source ChatGPT web projects and overcoming regional access restrictions for OpenAI APIs. It targets developers and users who need to leverage Azure OpenAI services through an OpenAI-compatible interface.
How It Works
The proxy acts as an intermediary, translating incoming OpenAI API requests into Azure OpenAI API requests. It supports all Azure OpenAI APIs and models (including custom fine-tuned ones) and allows custom mapping between Azure deployment names and OpenAI model names. For APIs Azure does not support, it returns mock responses, ensuring compatibility with client applications.
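The core of the translation is a URL rewrite: an OpenAI-style request such as POST /v1/chat/completions with a model name is redirected to the corresponding Azure deployment URL. The sketch below illustrates the rewritten form; the endpoint, deployment name, and API version are illustrative placeholders, not values from this project.

```shell
# Sketch of the path translation the proxy performs.
# All three values below are assumptions for illustration.
AZURE_OPENAI_ENDPOINT="https://my-resource.openai.azure.com"
DEPLOYMENT="gpt-35-deployment"
API_VERSION="2023-03-15-preview"

# An OpenAI client calls:  POST /v1/chat/completions  with "model": "gpt-3.5-turbo"
# The proxy rewrites that to an Azure deployment URL of this shape:
echo "${AZURE_OPENAI_ENDPOINT}/openai/deployments/${DEPLOYMENT}/chat/completions?api-version=${API_VERSION}"
```

The deployment segment is where the model-to-deployment mapping (see AZURE_OPENAI_MODEL_MAPPER below) comes into play.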
Quick Start & Requirements
docker pull ishadows/azure-openai-proxy:latest
docker run -d -p 8080:8080 --name=azure-openai-proxy \
--env AZURE_OPENAI_ENDPOINT={your azure endpoint} \
--env AZURE_OPENAI_MODEL_MAPPER={your custom model mapper} \
ishadows/azure-openai-proxy:latest
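The same container can be described declaratively. A minimal docker-compose sketch, assuming the same image and environment variables as the docker run command above (the endpoint and mapping values are placeholders):

```yaml
version: "3"
services:
  azure-openai-proxy:
    image: ishadows/azure-openai-proxy:latest
    ports:
      - "8080:8080"
    environment:
      AZURE_OPENAI_ENDPOINT: https://my-resource.openai.azure.com   # placeholder endpoint
      AZURE_OPENAI_MODEL_MAPPER: gpt-3.5-turbo=my-gpt35-deployment  # placeholder mapping
```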
Highlighted Details
Supports the /v1/chat/completions, /v1/completions, and /v1/embeddings endpoints. Mocks the /v1/models interface and handles cross-domain OPTIONS requests. Azure deployment names can be mapped to OpenAI model names via AZURE_OPENAI_MODEL_MAPPER
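The exact AZURE_OPENAI_MODEL_MAPPER syntax should be checked against the project README; a commonly used form (assumed here) is comma-separated model=deployment pairs, which a shell one-liner can split for inspection:

```shell
# Assumed AZURE_OPENAI_MODEL_MAPPER syntax: comma-separated
# "openai-model=azure-deployment" pairs (names below are illustrative).
MAPPER="gpt-3.5-turbo=my-gpt35-deployment,gpt-4=my-gpt4-deployment"

# Split into one mapping per line to eyeball the pairs.
echo "$MAPPER" | tr ',' '\n'
```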
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The proxy has no built-in HTTPS support for forward-proxy usage; TLS must be terminated by an external HTTPS proxy such as Nginx. The project was last updated in April 2023, which may indicate limited ongoing maintenance.
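One way to add the missing HTTPS layer is an Nginx server block that terminates TLS and forwards plain HTTP to the proxy container. A minimal sketch, assuming the container listens on 127.0.0.1:8080 as in the docker command above (hostname and certificate paths are placeholders):

```nginx
server {
    listen 443 ssl;
    server_name api.example.com;                         # placeholder hostname
    ssl_certificate     /etc/nginx/certs/fullchain.pem;  # placeholder cert path
    ssl_certificate_key /etc/nginx/certs/privkey.pem;    # placeholder key path

    location / {
        proxy_pass http://127.0.0.1:8080;  # the azure-openai-proxy container
        proxy_set_header Host $host;
    }
}
```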