zhiyu1998 / Gemi2Api-Server: Gemini API server with OpenAI compatibility
Top 96.8% on SourcePulse
Summary
Gemi2Api-Server provides a self-hosted backend that exposes Google Gemini's capabilities via an OpenAI-compatible API. It targets developers and power users seeking to integrate Gemini models into applications, offering a convenient alternative to the official APIs by leveraging browser cookies for authentication.
How It Works
This project implements a FastAPI server acting as a proxy to the Gemini web interface. It requires authentication cookies (__Secure-1PSID, __Secure-1PSIDTS) from a logged-in Gemini session, and an optional API_KEY can be configured to authenticate requests made to the server itself. The server forwards requests and returns responses, mimicking OpenAI's chat completion API structure so that existing client libraries can integrate with little or no change.
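For illustration, the sketch below shows how an existing OpenAI client library can be pointed at the proxy. It assumes the server is running locally on port 8000 with an API_KEY of "my-secret"; the model name is a placeholder (the available names can be fetched from the model-listing endpoint described under Highlighted Details).

    # Minimal sketch: call the self-hosted proxy through the official OpenAI Python SDK.
    # Assumptions: server at http://127.0.0.1:8000, API_KEY set to "my-secret",
    # "gemini-2.0-flash" is a placeholder model name (query GET /v1/models for real ones).
    from openai import OpenAI

    client = OpenAI(
        base_url="http://127.0.0.1:8000/v1",  # route SDK traffic to the local server
        api_key="my-secret",                   # matches the server's optional API_KEY
    )

    response = client.chat.completions.create(
        model="gemini-2.0-flash",  # placeholder; pick one returned by /v1/models
        messages=[{"role": "user", "content": "Explain what this proxy does in one line."}],
    )
    print(response.choices[0].message.content)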
Quick Start & Requirements
Install via uv (uv init, uv add fastapi uvicorn gemini-webapi, uv sync) or pip (pip install fastapi uvicorn gemini-webapi), then run with uvicorn main:app --reload --host 127.0.0.1 --port 8000. Docker is recommended: clone the repo, copy .env.example to .env, fill in the cookie values, and run docker-compose up -d. The service is then available at http://0.0.0.0:8000. Gemini browser cookies are required in either case.
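Once the process or container is up, a quick smoke test against the health-check and model-listing routes noted under Highlighted Details can confirm the deployment; the sketch below assumes those routes are reachable without an API_KEY (add an Authorization header if your instance enforces one).

    # Smoke test for a freshly started instance, assuming it listens on localhost:8000.
    import requests

    BASE = "http://127.0.0.1:8000"

    health = requests.get(f"{BASE}/", timeout=10)            # health check
    print("health:", health.status_code)

    models = requests.get(f"{BASE}/v1/models", timeout=10)   # OpenAI-style model list
    models.raise_for_status()
    print("models:", models.json())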
Highlighted Details
OpenAI-compatible chat completions endpoint (POST /v1/chat/completions). Optional request authentication via an API_KEY. Health check (GET /) and model listing (GET /v1/models).
Maintenance & Community
Acknowledges developer contributions but lacks specific community channel links or a roadmap.
Licensing & Compatibility
The license type is not explicitly stated. Compatibility for commercial use requires verification.
Limitations & Caveats
Relies on browser cookies that expire frequently and must be refreshed. Common 500 errors may arise from IP restrictions or rate limits. Obtaining the cookies requires browser developer tools. Stability depends on Google not altering the Gemini web interface structure.
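Because transient 500 responses (rate limits, IP restrictions, expired cookies) are a known failure mode, client code may want a small retry wrapper. The sketch below is a generic client-side pattern with exponential backoff, not functionality provided by the project itself.

    # Generic client-side retry for transient 500s (rate limits, IP blocks, stale cookies).
    # Not part of Gemi2Api-Server; shown only as a coping pattern for the caveats above.
    import time
    import requests

    def post_with_retry(url, payload, headers=None, attempts=3):
        resp = None
        for attempt in range(attempts):
            resp = requests.post(url, json=payload, headers=headers, timeout=60)
            if resp.status_code != 500:
                return resp
            print(f"attempt {attempt + 1} returned 500: {resp.text[:200]}")
            time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, ...
        return resp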
1 month ago
Inactive