Soju06/codex-lb: ChatGPT account load balancer and proxy
Summary
codex-lb manages multiple ChatGPT accounts behind a single load balancer and proxy. It targets developers and power users who need centralized API access, usage tracking, and cost control, offering a unified, OpenAI-compatible interface.
How It Works
This project acts as a proxy pooling multiple ChatGPT accounts, distributing requests to balance load and optimize utilization. It features per-account usage tracking (tokens, cost), granular API key management with rate limits, and a web dashboard for monitoring and configuration. Its advantage lies in abstracting LLM API endpoint complexity behind a single, consistent interface.
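The pooling idea above can be sketched in a few lines. This is a minimal illustration of round-robin distribution with per-account token tracking; the class and field names are assumptions for illustration, not codex-lb's actual internals.

```python
import itertools
from dataclasses import dataclass


@dataclass
class Account:
    """One upstream ChatGPT account in the pool (illustrative)."""
    name: str
    tokens_used: int = 0  # per-account usage counter


class AccountPool:
    """Round-robin pool that records per-account token usage."""

    def __init__(self, accounts):
        self.accounts = accounts
        self._cycle = itertools.cycle(accounts)

    def next_account(self) -> Account:
        # Distribute each incoming request to the next account in turn.
        return next(self._cycle)

    def record_usage(self, account: Account, tokens: int) -> None:
        # Track consumption so the dashboard can report cost per account.
        account.tokens_used += tokens


pool = AccountPool([Account("alice"), Account("bob")])
first = pool.next_account()       # "alice" handles the first request
pool.record_usage(first, 120)     # 120 tokens billed to that account
```

A production balancer would also weigh accounts by remaining quota or error rate rather than cycling blindly, but the interface — pick an account, forward the request, record usage — stays the same.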
Quick Start & Requirements
The recommended installation is via Docker:
docker volume create codex-lb-data
docker run -d --name codex-lb \
  -p 2455:2455 -p 1455:1455 \
  -v codex-lb-data:/var/lib/codex-lb \
  ghcr.io/soju06/codex-lb:latest
Access the dashboard at http://localhost:2455 to add accounts. Clients point to http://127.0.0.1:2455/v1 or http://127.0.0.1:2455/backend-api/codex. API key authentication can be enabled via the dashboard.
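To show what pointing a client at the proxy looks like, the sketch below builds an OpenAI-style chat-completion request against the `/v1` endpoint from the README. The model name and API key are placeholder assumptions, and the final send is left commented out since it requires a running instance.

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:2455/v1"  # codex-lb's OpenAI-compatible endpoint
API_KEY = "sk-example"                 # placeholder; issue real keys in the dashboard

payload = {
    "model": "gpt-5",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",  # only needed if key auth is enabled
    },
    method="POST",
)
# urllib.request.urlopen(request) would send it once codex-lb is running.
```

Any OpenAI-compatible SDK works the same way: set its base URL to the proxy instead of api.openai.com and it never needs to know how many accounts sit behind it.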
Maintenance & Community
The project lists several contributors, including Soju06 and Jonas Kamsker. It follows the all-contributors specification, welcoming contributions. No specific community channels (e.g., Discord, Slack) are linked in the README.
Licensing & Compatibility
The project's license is not explicitly stated in the README and should be clarified before adoption, especially for commercial use. It is designed for compatibility with OpenAI-compatible clients.
Limitations & Caveats
A significant caveat is the absence of a stated license. Some WebSocket features are experimental, and upstream Codex support for stable WebSocket APIs is noted as lacking. Reverse proxies must be configured to handle WebSocket upgrades.