PublicAffairs: OpenAI API proxy for the Gemini API, deployable serverlessly
Top 15.6% on SourcePulse
This project provides a serverless, OpenAI-compatible API proxy for Google's Gemini models, letting users leverage Gemini's capabilities through existing OpenAI API integrations. It targets developers who want to use advanced AI models without writing a Gemini-specific integration or maintaining a server, and offers a free, personal endpoint with generous limits.
How It Works
The proxy acts as an intermediary, translating OpenAI API requests into Gemini API calls. It deploys as a serverless function on platforms like Vercel, Netlify, or Cloudflare Workers, abstracting away infrastructure management. This keeps deployment easy and free, with essentially nothing to maintain, while powerful AI models remain accessible via familiar OpenAI endpoints.
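Conceptually, the translation looks roughly like the following. This is a simplified sketch, not the project's actual code; it only assumes the field names of Google's generateContent request format on the Gemini side.

```typescript
// Simplified sketch of the translation the proxy performs (not the project's
// actual implementation): an OpenAI-style chat request is reshaped into the
// payload that Gemini's generateContent endpoint expects.
interface OpenAIChatRequest {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  temperature?: number;
  max_tokens?: number;
}

function toGeminiRequest(req: OpenAIChatRequest) {
  const system = req.messages.find((m) => m.role === "system");
  return {
    // OpenAI "messages" become Gemini "contents"; the assistant role is
    // called "model" on the Gemini side.
    contents: req.messages
      .filter((m) => m.role !== "system")
      .map((m) => ({
        role: m.role === "assistant" ? "model" : "user",
        parts: [{ text: m.content }],
      })),
    // A system message maps to Gemini's systemInstruction field.
    ...(system && { systemInstruction: { parts: [{ text: system.content }] } }),
    // Sampling options move into generationConfig.
    generationConfig: {
      temperature: req.temperature,
      maxOutputTokens: req.max_tokens,
    },
  };
}
```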
Quick Start & Requirements
- Install and run locally with npm install and npm run start (Node.js), or the equivalent for Deno/Bun.
- Configure your OpenAI client with the deployed proxy's base URL (e.g., https://your-proxy.vercel.app/v1) and your Gemini API key, as in the sketch below.
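For example, the official openai Node package can be pointed at the proxy by overriding its base URL. This is a minimal sketch: the deployment URL is the placeholder from above, and gemini-1.5-flash is only an example model name.

```typescript
import OpenAI from "openai";

// Standard OpenAI client, pointed at the proxy instead of api.openai.com.
// The API key is a Gemini API key; the proxy forwards it to Google.
const client = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,
  baseURL: "https://your-proxy.vercel.app/v1", // your deployed proxy URL
});

const completion = await client.chat.completions.create({
  model: "gemini-1.5-flash", // example Gemini model name, passed through unchanged
  messages: [{ role: "user", content: "Explain what a serverless API proxy does." }],
});

console.log(completion.choices[0].message.content);
```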
Highlighted Details
- Supports the chat/completions endpoint, with many OpenAI parameters mapped to Gemini equivalents.
- Also implements the embeddings and models endpoints.
- Accepts image inputs, converting them to Gemini inlineData.
- Enables search grounding by appending :search to model names (see the example below).
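As a rough sketch of the last two items, both features are reached through the same OpenAI-shaped requests. The image data URL below is a hypothetical placeholder, and the conversion of image_url parts into inlineData is assumed to happen inside the proxy.

```typescript
import OpenAI from "openai";

// Same client setup as in the quick start (hypothetical deployment URL).
const client = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,
  baseURL: "https://your-proxy.vercel.app/v1",
});

// ":search" appended to the model name enables search grounding.
const grounded = await client.chat.completions.create({
  model: "gemini-1.5-flash:search",
  messages: [{ role: "user", content: "What are today's top AI news stories?" }],
});

// An OpenAI-style image_url part; the proxy is assumed to convert the
// base64 data URL into Gemini inlineData. The image below is a placeholder.
const vision = await client.chat.completions.create({
  model: "gemini-1.5-flash",
  messages: [
    {
      role: "user",
      content: [
        { type: "text", text: "Describe this image." },
        { type: "image_url", image_url: { url: "data:image/png;base64,iVBORw0KGgo..." } },
      ],
    },
  ],
});

console.log(grounded.choices[0].message.content);
console.log(vision.choices[0].message.content);
```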
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
- The name property for messages and the logit_bias, logprobs, top_logprobs, and parallel_tool_calls parameters are not implemented.
- The n parameter has limitations when streaming.
- Temperature mapping may differ from OpenAI's behavior.