SDK/proxy for calling 100+ LLM APIs using the OpenAI format
Top 1.5% on sourcepulse
LiteLLM provides a unified Python SDK and proxy server for interacting with over 100 large language model (LLM) APIs, abstracting away provider-specific differences. It targets developers and researchers needing to integrate multiple LLMs seamlessly, offering a consistent OpenAI-like API format for completion, embedding, and image generation calls, along with features like retries, fallbacks, and rate limiting.
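The retry-and-fallback behavior can be sketched in plain Python. This is an illustrative pattern, not litellm's actual implementation; the provider callables, the `call_with_fallbacks` helper, and the backoff policy are all hypothetical:

```python
import time

def call_with_fallbacks(providers, prompt, retries=2, backoff=0.0):
    """Try each provider callable in order, retrying transient failures.
    Illustrative only: litellm's real retry/fallback logic is richer
    (rate-limit awareness, per-deployment cooldowns, etc.)."""
    last_err = None
    for call in providers:
        for attempt in range(retries + 1):
            try:
                return call(prompt)
            except Exception as err:  # real code would catch narrower errors
                last_err = err
                time.sleep(backoff * (2 ** attempt))  # exponential backoff
    raise RuntimeError("all providers failed") from last_err

def flaky(prompt):   # stub provider that is always down
    raise TimeoutError("provider unavailable")

def stable(prompt):  # stub provider that always answers
    return f"echo: {prompt}"

result = call_with_fallbacks([flaky, stable], "hi")  # falls back to `stable`
```

The point of the pattern is that callers see one stable interface while failures are absorbed per provider, which is the same shape litellm exposes at the API level.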
How It Works
LiteLLM acts as a translation layer, converting incoming requests into the format required by the target LLM provider. It supports a wide array of providers including OpenAI, Azure, Bedrock, VertexAI, and HuggingFace. The core advantage is the consistent output format and the ability to switch between models and providers with minimal code changes, simplifying complex LLM orchestration.
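In code, switching providers amounts to changing the model string. A minimal sketch of the unified call, assuming litellm is installed and an OPENAI_API_KEY is set (the network call is guarded so the snippet is safe to run without a key; the specific model id is an example, not a requirement):

```python
import os

# OpenAI-format messages work unchanged across providers.
messages = [{"role": "user", "content": "Say hello in one word."}]

if os.environ.get("OPENAI_API_KEY"):
    from litellm import completion
    # Swapping providers is a model-string change, e.g. a Claude or
    # Bedrock model id with the matching provider key set instead.
    response = completion(model="gpt-4o-mini", messages=messages)
    print(response.choices[0].message.content)
```

Because the response mirrors the OpenAI schema (`response.choices[0].message.content`), downstream code does not change when the backing provider does.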
Quick Start & Requirements
pip install litellm
Requires openai>=1.0.0 and pydantic>=2.0.0. API keys for the desired LLM providers are required.
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats