LLM group chat framework for parallel outputs from multiple LLMs
Top 87.4% on sourcepulse
OpenAOE is a framework designed for simultaneous interaction with multiple Large Language Models (LLMs), enabling users to send a single prompt and receive parallel responses. It targets LLM researchers, evaluators, and developers, simplifying the process of comparing and utilizing various commercial and open-source LLMs.
How It Works
OpenAOE facilitates "LLM Group Chat" (LGC) by providing a unified interface to interact with diverse LLMs. It supports both single-model serial responses and multi-model parallel responses. The framework abstracts away the complexities of integrating different LLM APIs, allowing users to configure and access models via a config-template.yaml file, which specifies backend and frontend settings.
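To make the parallel-response idea concrete, the sketch below fans one prompt out to several model backends with asyncio and gathers the replies. This is a conceptual illustration only, not OpenAOE's actual internals: the query_model helper and the provider names are assumptions standing in for real provider API calls.

```python
import asyncio

async def query_model(provider: str, prompt: str) -> str:
    """Hypothetical stand-in for one provider's API call."""
    await asyncio.sleep(0.1)  # simulates a network round trip
    return f"[{provider}] reply to: {prompt}"

async def group_chat(prompt: str, providers: list[str]) -> dict[str, str]:
    # Fan the same prompt out to every configured model and collect replies in parallel.
    replies = await asyncio.gather(*(query_model(p, prompt) for p in providers))
    return dict(zip(providers, replies))

if __name__ == "__main__":
    results = asyncio.run(
        group_chat("Summarize attention in one sentence.", ["gpt-4", "claude-3", "internlm2"])
    )
    for provider, reply in results.items():
        print(f"{provider}: {reply}")
```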
Quick Start & Requirements
1. Install via pip: pip install -U openaoe, or pull the Docker image: docker pull opensealion/openaoe:latest.
2. Run with your config: openaoe -f /path/to/your/config-template.yaml (or the equivalent docker run ... command).
3. A config-template.yaml file is necessary to define LLM access details; an illustrative sketch follows this list.
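The authoritative schema is the config-template.yaml bundled with the project; the snippet below is only a hedged sketch of the kind of backend/frontend settings and per-model access details such a file carries. All field names, ports, and endpoints here are assumptions for illustration.

```yaml
# Illustrative sketch only: field names and values are assumptions,
# not the actual OpenAOE schema; consult the bundled config-template.yaml.
backend:
  models:
    - name: gpt-4
      api_key: YOUR_OPENAI_API_KEY        # sensitive; supplied by the user
      api_base: https://api.openai.com/v1
    - name: internlm2-chat
      api_base: http://localhost:23333/v1  # assumed local endpoint
frontend:
  port: 10099                              # assumed serving port
  default_models: [gpt-4, internlm2-chat]
```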
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The README does not specify a license, which may impact commercial adoption. Users are responsible for obtaining and configuring API keys for commercial models. The config-template.yaml file requires manual population with sensitive API access data.