Proxy for Google Gemini
This project provides a proxy server that translates OpenAI API requests to Google Gemini Pro API calls, enabling applications designed for OpenAI to seamlessly utilize Gemini models without code modifications. It targets developers and users who want to leverage Gemini's capabilities with existing OpenAI-compatible infrastructure.
How It Works
The proxy acts as an intermediary, intercepting requests formatted for OpenAI's API and reformatting them to match Google Gemini's protocol. It supports various Gemini models, mapping common OpenAI model names (such as gpt-3.5-turbo and gpt-4) to their Gemini equivalents (e.g., gemini-1.5-flash-8b-latest and gemini-1.5-pro-latest). This approach allows for easy integration and model switching without client-side changes.
Quick Start & Requirements
Start the proxy with deno task start:deno, npm run start:node, or bun run start:bun, depending on your runtime.
Alternatively, run it with Docker: docker run -d -p 8000:8000 ghcr.io/zuisong/gemini-openai-proxy:<deno|bun|node>
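Once the proxy is running, it reshapes incoming OpenAI-style chat requests into Gemini's generateContent request format. A minimal sketch of that reshaping is below; the Gemini-side field names follow the public generateContent schema, but the function itself is an illustration, not the proxy's actual implementation, and it ignores details such as system-message handling and generation parameters.

```python
# Illustrative sketch: convert an OpenAI chat-completions request body
# into a Gemini generateContent-style body. Not the proxy's real code.
def to_gemini_request(openai_body: dict) -> dict:
    contents = []
    for msg in openai_body.get("messages", []):
        # Gemini uses the role "model" for assistant turns, "user" otherwise.
        role = "model" if msg["role"] == "assistant" else "user"
        contents.append({"role": role, "parts": [{"text": msg["content"]}]})
    return {"contents": contents}

req = {"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]}
print(to_gemini_request(req))
# {'contents': [{'role': 'user', 'parts': [{'text': 'Hello'}]}]}
```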
Highlighted Details
The proxy exposes an OpenAI-compatible chat completions endpoint (/v1/chat/completions).
Maintenance & Community
The project is maintained by Zuisong. A star history graph is available in the README.
Licensing & Compatibility
The repository does not explicitly state a license in the provided README. This may pose compatibility issues for commercial or closed-source use cases.
Limitations & Caveats
The absence of a stated license could impact commercial adoption. Compatibility with OpenAI API features beyond chat completions is not detailed.