rust-genai by jeremychone

A Rust client library for multiple generative AI providers

Created 1 year ago · 496 stars · Top 63.4% on sourcepulse

Project Summary

This Rust library provides a unified, ergonomic API for interacting with multiple generative AI providers, including OpenAI, Anthropic, Gemini, Ollama, and others. It's designed for developers building applications that need to leverage diverse LLM capabilities without managing multiple SDKs, offering a single interface for chat completions and streaming.

How It Works

The library takes a native Rust approach rather than wrapping per-provider SDKs. It normalizes API differences at a lower layer, creating a common pattern for chat completion and streaming. This design prioritizes ergonomics and commonality, letting developers switch between, or combine, models from different providers through one consistent interface. Custom endpoints, authentication, and model mapping are also supported.
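
For illustration, a minimal sketch of that common pattern, in the style of the repository's examples/c00-readme.rs (model names are illustrative, and method names follow the project's README examples; they may differ across versions):

    use genai::chat::{ChatMessage, ChatRequest};
    use genai::Client;

    #[tokio::main]
    async fn main() -> Result<(), Box<dyn std::error::Error>> {
        // One client and one request shape, regardless of provider.
        let client = Client::default();

        let chat_req = ChatRequest::new(vec![
            ChatMessage::system("Answer in one short sentence."),
            ChatMessage::user("Why is the sky blue?"),
        ]);

        // The same call works across providers; only the model name changes.
        // Each remote model needs its provider's API key in the environment;
        // the Ollama model runs locally.
        for model in ["gpt-4o-mini", "claude-3-haiku-20240307", "gemma:2b"] {
            let res = client.exec_chat(model, chat_req.clone(), None).await?;
            println!("{model}: {}", res.content_text_as_str().unwrap_or("NO ANSWER"));
        }

        Ok(())
    }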

Quick Start & Requirements

  • Install via cargo add genai.
  • Requires a Rust toolchain.
  • API keys for specific providers are typically set via environment variables (e.g., OPENAI_API_KEY); keys can also be resolved programmatically, as sketched after this list.
  • See examples/c00-readme.rs for a quick overview.
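
Where environment variables are inconvenient, a custom auth resolver can supply keys instead. This is a sketch modeled on the repository's auth-resolver example; the MY_PROVIDER_KEY variable name is hypothetical, and the resolver types should be checked against the crate's resolver module for the version in use:

    use genai::resolver::{AuthData, AuthResolver};
    use genai::{Client, ModelIden};

    fn build_client() -> Client {
        // Resolve the API key per adapter/model instead of relying on the
        // default environment-variable lookup. (Sketch only.)
        let auth_resolver = AuthResolver::from_resolver_fn(
            |model_iden: ModelIden| -> Result<Option<AuthData>, genai::resolver::Error> {
                let ModelIden { adapter_kind, model_name } = model_iden;
                println!("Resolving key for {adapter_kind} (model: {model_name})");
                // MY_PROVIDER_KEY is a hypothetical variable name.
                Ok(std::env::var("MY_PROVIDER_KEY").ok().map(AuthData::from_single))
            },
        );

        Client::builder().with_auth_resolver(auth_resolver).build()
    }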

Highlighted Details

  • Native support for OpenAI, Anthropic, Gemini, Ollama, Groq, xAI, DeepSeek, and Cohere.
  • Image analysis capabilities for OpenAI, Gemini, and Anthropic.
  • Supports custom endpoints, authentication, and model aliasing.
  • Includes streaming responses and detailed usage metadata (see the streaming sketch after this list).
  • Model mapping can be customized to control how a model name resolves to a provider adapter.
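
A sketch of the streaming side, following the pattern of the repository's stream examples (print_chat_stream and PrintChatStreamOptions come from the crate's chat::printer module; the model name is illustrative):

    use genai::chat::printer::{print_chat_stream, PrintChatStreamOptions};
    use genai::chat::{ChatMessage, ChatRequest};
    use genai::Client;

    #[tokio::main]
    async fn main() -> Result<(), Box<dyn std::error::Error>> {
        let client = Client::default();
        let chat_req = ChatRequest::new(vec![ChatMessage::user("Tell a one-line joke.")]);

        // exec_chat_stream returns a stream of chat events rather than a
        // single response.
        let chat_res = client.exec_chat_stream("gpt-4o-mini", chat_req, None).await?;

        // Print chunks as they arrive; passing true instead would also print
        // stream events, including usage metadata where the provider reports it.
        let options = PrintChatStreamOptions::from_print_events(false);
        print_chat_stream(chat_res, Some(&options)).await?;

        Ok(())
    }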

Maintenance & Community

  • Released v0.2.0 on 2025-04-16 with API changes.
  • Active development with contributions noted for tracing, Gemini support, image support, and more.
  • Sponsored by BriteSnow.
  • GitHub: github.com/jeremychone/rust-genai

Licensing & Compatibility

  • Licensed under MIT.
  • Compatible with commercial and closed-source applications.

Limitations & Caveats

Currently, AsyncFn traits are not fully supported due to limitations in stable Rust. Ollama streaming responses may not include usage tokens because of an upstream limitation, and Gemini stream usage is assumed to be cumulative.

Health Check

  • Last commit: 1 day ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 12
  • Issues (30d): 1
  • Star history: 101 stars in the last 90 days
