ProxyLLM by zhalice2011

Local API proxy for unifying LLM web services

Created 3 weeks ago

New!

315 stars

Top 86.1% on SourcePulse

View on GitHub
Project Summary

ProxyLLM is a local Electron application that unifies the capabilities of various Large Language Model (LLM) websites. It captures browser sessions, extracts the necessary credentials, and exposes a single OpenAI-compatible API endpoint, letting developers integrate diverse LLM services into their applications. A notable feature is one-click integration with tools like Claude Code.

How It Works

The application launches target LLM websites inside its own Electron windows and monitors their network traffic. It uses Electron's webRequest API, the Chrome DevTools Protocol (CDP), and an optional local MITM proxy to capture authentication tokens, cookies, and session IDs. A flexible adapter system translates requests from the unified OpenAI-compatible format into the specific protocols of the individual LLM sites, while a control panel enables dynamic model discovery and management.
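The adapter idea can be sketched roughly as below. The `SiteAdapter` interface and the example adapter are hypothetical illustrations of the pattern, not ProxyLLM's actual code:

```typescript
// Unified OpenAI-style chat request (subset of fields).
interface ChatMessage { role: "system" | "user" | "assistant"; content: string; }
interface ChatRequest { model: string; messages: ChatMessage[]; }

// Each target site gets an adapter that translates the unified request
// into that site's own wire format (illustrative interface, not the
// project's real API).
interface SiteAdapter<SitePayload> {
  siteId: string;
  toSiteRequest(req: ChatRequest): SitePayload;
}

// Hypothetical adapter for a site that expects a single flattened prompt
// plus a session id captured from the browser window.
const exampleAdapter: SiteAdapter<{ prompt: string; sessionId: string }> = {
  siteId: "example-llm-site",
  toSiteRequest(req) {
    const prompt = req.messages.map(m => `${m.role}: ${m.content}`).join("\n");
    return { prompt, sessionId: "captured-session-id" };
  },
};
```

With one adapter per site, the proxy can route any OpenAI-format request to whichever site the selected model belongs to.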

Quick Start & Requirements

Installation involves standard Node.js package management:

  1. npm install
  2. npm --prefix renderer install
  3. npm --prefix renderer run build
  4. npm run build
  5. npm run start

The API server defaults to 127.0.0.1:8080. Development mode with hot-reloading is available via npm --prefix renderer run dev and npm run dev. No specific hardware or OS prerequisites beyond Node.js are mentioned, but interaction with web LLMs implies standard internet connectivity.
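Once the server is running, clients talk to it like any OpenAI-compatible endpoint. The sketch below builds such a request against the README's default address; the helper name and the model id are illustrative, not part of the project:

```typescript
// Default address from the README's quick start.
const BASE_URL = "http://127.0.0.1:8080";

// Build the fetch arguments for an OpenAI-compatible chat completion.
// (Hypothetical helper; the model id would come from /v1/models.)
function buildChatCompletion(model: string, userMessage: string) {
  return {
    url: `${BASE_URL}/v1/chat/completions`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

// With the app running, this would send the request:
//   const { url, init } = buildChatCompletion("some-captured-model", "Hello");
//   const res = await fetch(url, init);
```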

Highlighted Details

  • Provides OpenAI-compatible API endpoints (/v1/chat/completions, /v1/models) and Anthropic's native /v1/messages endpoint.
  • Supports credential capture via webRequest, CDP, and optional MITM proxy, with OAuth device code/PKCE flows for services like Gemini and Qwen.
  • Features a multi-site control panel for managing target LLM websites, viewing captured requests, and selecting credentials.
  • Enables seamless integration with Claude Code by allowing its CLI to be pointed to the local ProxyLLM API.
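Since the proxy exposes both OpenAI's /v1/chat/completions and Anthropic's /v1/messages formats, it has to translate between them. The mapping below is a simplified sketch based on the two public API shapes (Anthropic carries the system prompt in a top-level field), not the project's actual conversion code:

```typescript
interface OpenAIMsg { role: "system" | "user" | "assistant"; content: string; }

// Simplified OpenAI-body -> Anthropic-body conversion: system messages are
// lifted into Anthropic's top-level `system` field, the rest pass through.
function openAIToAnthropic(
  body: { model: string; messages: OpenAIMsg[]; max_tokens?: number }
): { model: string; max_tokens: number; system?: string; messages: OpenAIMsg[] } {
  const system = body.messages
    .filter(m => m.role === "system")
    .map(m => m.content)
    .join("\n");
  const messages = body.messages.filter(m => m.role !== "system");
  return {
    model: body.model,
    max_tokens: body.max_tokens ?? 1024, // Anthropic requires max_tokens; default is illustrative
    ...(system ? { system } : {}),
    messages,
  };
}
```

A real proxy would also need the reverse mapping for responses and streaming chunks, which this sketch omits.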

Maintenance & Community

The provided README does not contain specific details regarding maintainers, community channels (like Discord/Slack), sponsorships, or a public roadmap.

Licensing & Compatibility

The project is released under the MIT License. This permissive license generally allows for commercial use, modification, and distribution without significant restrictions, making it compatible with closed-source applications.

Limitations & Caveats

The MITM proxy certificate is only trusted within the Electron window and is not installed system-wide. Users must comply with the terms of service of the LLM websites they integrate. The README does not detail specific performance benchmarks or known bugs.

Health Check
Last Commit

3 weeks ago

Responsiveness

Inactive

Pull Requests (30d)
0
Issues (30d)
0
Star History
315 stars in the last 25 days

Explore Similar Projects

Starred by Chip Huyen (author of "AI Engineering" and "Designing Machine Learning Systems") and David Cramer (co-founder of Sentry).

llmgateway by theopenco

Top 1.8% on SourcePulse

824 stars
LLM API gateway for unified provider access
Created 9 months ago
Updated 2 days ago