Gemini API proxy and load balancer for access from within China
This project provides a lightweight, serverless proxy and load balancer for the Gemini API, enabling free access to Gemini models from within China. It aggregates multiple Gemini API keys and distributes requests across them at random, so the usable quota scales with the number of keys. The target audience is developers and users in regions where access to Google's AI services is restricted.
How It Works
The core functionality runs on edge computing platforms such as Vercel, Deno Deploy, and Cloudflare Workers. The service acts as a transparent proxy, forwarding requests to the Gemini API while managing multiple API keys. Clients supply several keys together, and each request is forwarded with one of them chosen at random, which sidesteps per-key rate limits and raises overall throughput. The project also exposes an OpenAI-compatible API format for easier integration into existing applications.
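To make the forwarding flow concrete, here is a minimal sketch of such a proxy as a Deno Deploy-style handler. It assumes keys are passed comma-separated in the x-goog-api-key header used by the public Gemini REST API; the project's actual routes, header handling, and key selection may differ.

```typescript
// Minimal sketch (not the project's actual source): forward a request to the
// Gemini API, choosing one key at random from a comma-separated list supplied
// by the client in the x-goog-api-key header.

const GEMINI_HOST = "https://generativelanguage.googleapis.com";

async function handleRequest(req: Request): Promise<Response> {
  const incoming = new URL(req.url);

  // The client may send several keys at once, e.g. "key1,key2,key3".
  const keys = (req.headers.get("x-goog-api-key") ?? "")
    .split(",")
    .map((k) => k.trim())
    .filter(Boolean);
  if (keys.length === 0) {
    return new Response("Missing x-goog-api-key header", { status: 400 });
  }

  // Random selection spreads traffic across all supplied keys.
  const headers = new Headers(req.headers);
  headers.set("x-goog-api-key", keys[Math.floor(Math.random() * keys.length)]);

  // Rebuild the request against the real Gemini endpoint.
  // The body is buffered here for simplicity; a production proxy would stream it.
  const upstream = new URL(incoming.pathname + incoming.search, GEMINI_HOST);
  const body =
    req.method === "GET" || req.method === "HEAD"
      ? undefined
      : await req.arrayBuffer();

  return fetch(upstream, { method: req.method, headers, body });
}

Deno.serve(handleRequest);
```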
Quick Start & Requirements
Deploy the proxy to Vercel, Deno Deploy, Cloudflare Workers, or Netlify; the only requirement is one or more Gemini API keys.
Highlighted Details
Provides a /verify endpoint for validating multiple API keys simultaneously.
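As a rough illustration only, a request shaped like the one below could check several keys in one call. The deployment URL is a placeholder and the header carrying the keys is an assumption; consult the project's README for the exact interface.

```typescript
// Hypothetical call to the /verify endpoint (URL and header are assumptions).
const res = await fetch("https://your-proxy.example.com/verify", {
  method: "POST",
  headers: { "x-goog-api-key": "key1,key2,key3" }, // keys to validate, comma-separated
});
console.log(await res.text()); // expected: a per-key validity report
```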
Maintenance & Community
The project is maintained by "技术爬爬虾" (Tech Shrimp), with links provided to their Bilibili, YouTube, Douyin, and WeChat official accounts. There is no explicit mention of a community forum or roadmap.
Licensing & Compatibility
The repository does not explicitly state a license. The README implies free usage and redistribution of the proxy service, but the underlying Gemini API usage is subject to Google's terms of service. Compatibility with closed-source applications is high due to the OpenAI-compatible endpoint.
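For example, an existing application built on the official OpenAI SDK could be pointed at the proxy by overriding the client's base URL. The base URL path and model name below are assumptions for illustration; use the values documented for your own deployment.

```typescript
// Sketch: reuse an OpenAI-style client against the proxy's OpenAI-compatible endpoint.
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_GEMINI_API_KEY",                // a Gemini key, not an OpenAI key
  baseURL: "https://your-proxy.example.com/v1", // hypothetical deployment URL and path
});

const completion = await client.chat.completions.create({
  model: "gemini-2.0-flash",                    // assumed model identifier
  messages: [{ role: "user", content: "Hello from the proxy" }],
});

console.log(completion.choices[0].message.content);
```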
Limitations & Caveats
Cloudflare Worker deployments may fail if requests are routed through Hong Kong CDN nodes, since the Gemini API rejects connections from Hong Kong IPs. Netlify deployments are noted as potentially unstable. The project relies on free-tier Gemini API keys, whose quotas and availability may change at Google's discretion.