r2d4/openlm: OpenAI-compatible Python client for calling LLMs
Top 76.1% on SourcePulse
This library provides an OpenAI-compatible Python client for interacting with various Large Language Models (LLMs) beyond OpenAI's offerings. It targets developers and researchers who want to easily switch between or query multiple LLM providers, including HuggingFace and Cohere, using a unified API, thereby simplifying experimentation and model selection.
How It Works
OpenLM acts as a drop-in replacement for the OpenAI Python client, mimicking its Completion.create API structure. It achieves cross-provider compatibility by directly calling the inference APIs of supported services. This approach minimizes dependencies and overhead, allowing users to send multiple prompts to multiple models concurrently within a single request, returning a consolidated response.
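As an illustrative sketch of that multiplexed call (the provider-prefixed model identifiers and the list-valued arguments shown here are assumptions based on the description above, not confirmed API details):

    import openlm

    # One request fans out to several models/providers; the library is
    # described as returning the results in a single consolidated response.
    completion = openlm.Completion.create(
        model=["huggingface.co/gpt2", "cohere.ai/command"],  # assumed identifier format
        prompt=["The capital of France is", "Say hello in French:"],
        max_tokens=16,
    )
    print(completion)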
Quick Start & Requirements
pip install openlm
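A minimal first call might look like the sketch below; the model identifier and the API-key environment variable are assumptions to adapt to whichever provider you use:

    import openlm

    # Assumes the relevant provider credential (e.g. OPENAI_API_KEY) is
    # already set in the environment; exact variable names vary by provider.
    completion = openlm.Completion.create(
        model="text-davinci-003",             # assumed example model id
        prompt="Say hello in one sentence.",
        max_tokens=20,
    )
    print(completion)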
Highlighted Details
Mirrors OpenAI's Completion.create interface, so existing OpenAI-client code can be pointed at HuggingFace or Cohere models with minimal changes.
Maintenance & Community
Owned by r2d4, with contributions from njerschow. The last commit was 2 years ago and the project is currently inactive.
Licensing & Compatibility
Limitations & Caveats
The library currently supports only the Completion endpoint; other standardized endpoints such as ChatCompletion and Embeddings are planned but not yet implemented. The README does not specify version requirements for Python or other dependencies.