OpenAOE by InternLM

LLM group chat framework for parallel LLM outputs

created 1 year ago
313 stars

Top 87.4% on sourcepulse

Project Summary

OpenAOE is a framework designed for simultaneous interaction with multiple Large Language Models (LLMs), enabling users to send a single prompt and receive parallel responses. It targets LLM researchers, evaluators, and developers, simplifying the process of comparing and utilizing various commercial and open-source LLMs.

How It Works

OpenAOE facilitates "LLM Group Chat" (LGC) by providing a unified interface to interact with diverse LLMs. It supports both single-model serial responses and multi-model parallel responses. The framework abstracts away the complexities of integrating different LLM APIs, allowing users to configure and access models via a config-template.yaml file, which specifies backend and frontend settings.
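To make the multi-model parallel response idea concrete, below is a minimal, self-contained sketch of the fan-out pattern that OpenAOE exposes through its interface: one prompt is dispatched to several model backends at once and the replies are collected side by side. The call_* functions are hypothetical stand-ins for real provider calls, not OpenAOE APIs.

    # Minimal sketch of the one-prompt, many-models fan-out pattern.
    # The call_* functions are hypothetical stubs, not OpenAOE APIs.
    from concurrent.futures import ThreadPoolExecutor

    def call_gpt4(prompt: str) -> str:         # hypothetical stub
        return f"[gpt-4] echo: {prompt}"

    def call_gemini_pro(prompt: str) -> str:   # hypothetical stub
        return f"[gemini-pro] echo: {prompt}"

    def call_local_model(prompt: str) -> str:  # hypothetical stub (e.g. a model served via LMDeploy or Ollama)
        return f"[local] echo: {prompt}"

    def group_chat(prompt: str) -> dict:
        """Send one prompt to every configured model and gather the parallel replies."""
        backends = {
            "gpt-4": call_gpt4,
            "gemini-pro": call_gemini_pro,
            "local": call_local_model,
        }
        with ThreadPoolExecutor(max_workers=len(backends)) as pool:
            futures = {name: pool.submit(fn, prompt) for name, fn in backends.items()}
            return {name: fut.result() for name, fut in futures.items()}

    if __name__ == "__main__":
        for model, reply in group_chat("Compare two sorting algorithms.").items():
            print(f"{model}: {reply}")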

Quick Start & Requirements

  • Install: pip install -U openaoe or via Docker (docker pull opensealion/openaoe:latest).
  • Run: openaoe -f /path/to/your/config-template.yaml (or docker run ...).
  • Prerequisites: Python >= 3.9. Requires API keys for commercial LLMs.
  • Configuration: A config-template.yaml file is necessary to define LLM access details.
  • Docs: English, Chinese

Highlighted Details

  • Supports parallel responses from multiple LLMs with a single prompt.
  • Integrates with commercial LLM APIs (GPT-3.5, GPT-4, Gemini-Pro, etc.) and open-source LLMs (via LMDeploy, Ollama).
  • Provides both backend APIs and a Web UI (see the sketch after this list).
  • Recent updates include support for Gemma-7b, Qwen-7b (via Ollama), and Mistral-7b.
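As a rough illustration of driving the backend programmatically instead of through the Web UI, the sketch below posts a prompt to a locally running OpenAOE instance over HTTP. The base URL, endpoint path, and JSON fields are assumptions made for illustration only; the actual routes and payloads are defined by the project and documented in the repository.

    # Hypothetical sketch: query a locally running OpenAOE backend over HTTP.
    # The base URL, path, and JSON fields are assumptions, not documented OpenAOE routes.
    import json
    import urllib.request

    def ask_backend(prompt: str, base_url: str = "http://localhost:8000") -> dict:
        """POST a prompt to an assumed chat endpoint and return the parsed JSON reply."""
        payload = json.dumps({"prompt": prompt, "model": "gpt-4"}).encode("utf-8")
        req = urllib.request.Request(
            f"{base_url}/v1/chat",  # assumed path; check the project docs for the real routes
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            return json.loads(resp.read().decode("utf-8"))

    if __name__ == "__main__":
        print(ask_backend("Give one-sentence summaries of GPT-4 and Gemini-Pro."))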

Maintenance & Community

  • The project is actively developed, with recent commits and releases.
  • It is open source, with an open call for contributions and new features.

Licensing & Compatibility

  • The repository does not explicitly state a license in the provided README. Compatibility for commercial use or closed-source linking is not specified.

Limitations & Caveats

The README does not specify a license, which may impact commercial adoption. Users are responsible for obtaining and configuring API keys for commercial models. The config-template.yaml requires manual population with sensitive API access data.

Health Check

  • Last commit: 1 month ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 0

Star History

  • 18 stars in the last 90 days

Explore Similar Projects

Starred by Chip Huyen (author of AI Engineering, Designing Machine Learning Systems), Alexey Milovidov (cofounder of ClickHouse), and 7 more.

OpenLLM by bentoml

  • SDK for running open-source LLMs as OpenAI-compatible APIs
  • 12k stars · top 0.2%
  • Created 2 years ago · updated 4 days ago