genai: a multi-provider generative AI client library for Rust
This Rust library provides a unified, ergonomic API for interacting with multiple generative AI providers, including OpenAI, Anthropic, Gemini, Ollama, and others. It's designed for developers building applications that need to leverage diverse LLM capabilities without managing multiple SDKs, offering a single interface for chat completions and streaming.
How It Works
The library takes a native Rust approach rather than wrapping per-provider SDKs: it normalizes API differences in a lower layer so that chat completion and streaming follow one common pattern. This design prioritizes ergonomics and commonality, letting developers switch between, or combine, models from different providers through a single consistent interface, as the sketch below illustrates. Endpoints, authentication, and model mapping can also be customized.
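As a rough illustration of that common pattern, the sketch below runs one `ChatRequest` against models from several providers in a loop. It follows the usage shown in the crate's examples, but treat the specifics as assumptions: the model names are illustrative, and it presumes a `tokio` runtime plus recent `genai` APIs such as `Client::default()` and `exec_chat`.

```rust
use genai::chat::{ChatMessage, ChatRequest};
use genai::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::default();

    // One request, reused verbatim across providers.
    let chat_req = ChatRequest::new(vec![ChatMessage::user("Summarize Rust in one sentence.")]);

    // Illustrative model names; the crate resolves the provider adapter
    // (OpenAI, Anthropic, Gemini, Ollama, ...) from the model name.
    for model in ["gpt-4o-mini", "claude-3-5-haiku-latest", "gemini-2.0-flash", "llama3.2"] {
        let res = client.exec_chat(model, chat_req.clone(), None).await?;
        println!("=== {model}\n{}", res.content_text_as_str().unwrap_or("NO ANSWER"));
    }
    Ok(())
}
```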
Quick Start & Requirements
```sh
cargo add genai
```
Set the relevant provider API key as an environment variable (e.g., `OPENAI_API_KEY` for OpenAI); a minimal chat example is sketched below.
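A minimal hello-world sketch, adapted from the crate's published examples and assuming a `tokio` runtime; names like `exec_chat` and `content_text_as_str` reflect recent `genai` versions and may differ in yours.

```rust
use genai::chat::{ChatMessage, ChatRequest};
use genai::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Client::default() picks up credentials from the environment (e.g., OPENAI_API_KEY).
    let client = Client::default();

    let chat_req = ChatRequest::new(vec![
        ChatMessage::system("Answer concisely."),
        ChatMessage::user("Why is the sky blue?"),
    ]);

    // The model name selects the provider; "gpt-4o-mini" routes to OpenAI here.
    let chat_res = client.exec_chat("gpt-4o-mini", chat_req, None).await?;
    println!("{}", chat_res.content_text_as_str().unwrap_or("NO ANSWER"));
    Ok(())
}
```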
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
Currently, `AsyncFn` traits are not fully supported due to Rust stable limitations. Ollama streaming responses may not include usage tokens because of an upstream limitation, and Gemini stream usage is assumed to be cumulative.
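For context on the usage-token caveats, here is a hedged sketch of consuming a stream with usage capture turned on. `ChatOptions::with_capture_usage`, the `ChatStreamEvent` variants, and the `captured_usage` field follow recent `genai` examples but are assumptions, not guaranteed API; the sketch also assumes the `futures` crate for `StreamExt`.

```rust
use futures::StreamExt;
use genai::chat::{ChatMessage, ChatOptions, ChatRequest, ChatStreamEvent};
use genai::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::default();
    let chat_req = ChatRequest::new(vec![ChatMessage::user("Say hello.")]);

    // Ask the stream to capture usage so it can be read from the final event.
    let options = ChatOptions::default().with_capture_usage(true);
    let res = client
        .exec_chat_stream("gpt-4o-mini", chat_req, Some(&options))
        .await?;

    let mut stream = res.stream;
    while let Some(event) = stream.next().await {
        match event? {
            ChatStreamEvent::Chunk(chunk) => print!("{}", chunk.content),
            // Per the caveats above: this may be None for Ollama, and Gemini
            // reports cumulative totals.
            ChatStreamEvent::End(end) => println!("\nusage: {:?}", end.captured_usage),
            _ => {}
        }
    }
    Ok(())
}
```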