openai-style-api by tian-minghui

API proxy for OpenAI-style LLM access

created 1 year ago
416 stars

Top 71.5% on sourcepulse

Project Summary

This project provides a unified API gateway for various large language models (LLMs), abstracting away differences in their APIs and allowing users to interact with them using a single, OpenAI-compatible interface. It's designed for developers and users who need to integrate multiple LLMs into their applications or manage API key distribution efficiently.

How It Works

The project acts as a reverse proxy, accepting requests in the OpenAI API format and routing them to the appropriate LLM backend based on configuration. It supports various LLM providers (OpenAI, Azure OpenAI, Claude, Gemini, Kimi, Zhipu AI, Xunfei Spark, Qwen) and allows for custom parameter mapping and load balancing across multiple API keys or models.
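
As a concrete illustration, a client can point the official OpenAI Python SDK at the proxy instead of api.openai.com. This is a minimal sketch, not taken from the project's docs: the port follows the Quick Start default (8090), while the /v1 path, the proxy token, and the model name are assumptions that must match your own model-config.json.

```python
# Minimal sketch: calling the proxy with the OpenAI Python SDK (v1.x).
# base_url, api_key, and model below are placeholders; the token must be one
# defined in model-config.json, and the model name decides which backend the
# proxy routes the request to.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8090/v1",  # the proxy, not api.openai.com
    api_key="your-proxy-token",           # a token from model-config.json
)

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",                # routed to a configured backend
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```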

Quick Start & Requirements

  • Installation: Docker or local Python installation.
    • Docker: docker run -d -p 8090:8090 --name openai-style-api -e ADMIN-TOKEN=admin -v /path/to/your/model-config.json:/app/model-config.json tianminghui/openai-style-api
    • Local: git clone https://github.com/tian-minghui/openai-style-api.git; pip install -r requirements.txt; python open-api.py
  • Prerequisites: A model-config.json file is required for API key and model configuration (see the sketch after this list).
  • Resources: Minimal resource requirements for local deployment.
  • Docs: Configuration examples are provided in the README.
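
Because the exact schema of model-config.json is not reproduced here, the snippet below is only an illustrative sketch modeled on the configuration examples in the README: the field names ("token", "type", "config") and the Azure values are assumptions, so check the README before relying on them.

```python
# Illustrative sketch of writing a model-config.json with one Azure OpenAI
# entry. Field names and values are assumptions; the README's configuration
# examples are authoritative.
import json

model_config = [
    {
        "token": "my-proxy-token-1",   # key that clients present to the proxy
        "type": "azure-open-ai",       # backend adapter to use (assumed name)
        "config": {
            "api_base": "https://example.openai.azure.com/",
            "deployment_id": "gpt-35-turbo",
            "api_version": "2023-05-15",
            "api_key": "your-azure-api-key",
            "temperature": 0.8,
        },
    },
]

with open("model-config.json", "w", encoding="utf-8") as f:
    json.dump(model_config, f, ensure_ascii=False, indent=4)
```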

Highlighted Details

  • Supports OpenAI, Azure OpenAI, Claude (web & API), Gemini, Kimi, Zhipu AI, Xunfei Spark, Qwen, and Bing Chat.
  • Features stream support, load balancing (round-robin, random, parallel), and model routing based on model_name; a streaming sketch follows this list.
  • Includes a web interface for online configuration updates, served on port 8090 (e.g. http://localhost:8090/ on the host).
  • Allows for API key management and secondary distribution (issuing proxy-level tokens instead of sharing upstream provider keys).
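
For the streaming feature, a hedged client-side sketch is shown below; as before, the base URL, proxy token, and model name are assumptions that must match your deployment and model-config.json.

```python
# Minimal streaming sketch with the OpenAI Python SDK (v1.x); placeholders only.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8090/v1", api_key="your-proxy-token")

stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Write a short haiku."}],
    stream=True,                          # the proxy relays chunks as they arrive
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```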

Maintenance & Community

The project is maintained by a single developer, with a note encouraging community contributions (issues and PRs). No specific community channels or roadmap links are provided in the README.

Licensing & Compatibility

The project's license is not explicitly stated in the README. Compatibility for commercial use or closed-source linking would depend on the underlying licenses of the LLM APIs it integrates with.

Limitations & Caveats

The developer notes that model updates may not be timely due to limited personal bandwidth. Claude API support is pending testing because of an API waitlist. Some models, such as Baidu's Wenxin Yiyan (ERNIE Bot), are not yet supported.

Health Check

  • Last commit: 1 year ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 12 stars in the last 90 days
