ollama-rs by pepperoni21

Rust SDK for the Ollama API

created 1 year ago
875 stars

Top 41.9% on sourcepulse

Project Summary

This Rust library provides a straightforward interface for interacting with the Ollama API, targeting developers building applications that leverage local large language models. It simplifies common LLM operations like text generation, chat, model management, and embedding creation, enabling rapid integration of Ollama into Rust projects.

How It Works

The library utilizes Rust's async/await capabilities and the reqwest HTTP client to communicate with the Ollama server. It maps Ollama's REST API endpoints to idiomatic Rust functions, offering both standard and streaming responses for generation and model creation tasks. A Coordinator struct is introduced to manage tool usage and function calling, allowing LLMs to interact with external capabilities.

Quick Start & Requirements

  • Install via Cargo by adding ollama-rs = "0.3.1" to your Cargo.toml dependencies.
  • Or track the master branch: ollama-rs = { git = "https://github.com/pepperoni21/ollama-rs.git", branch = "master" }
  • Requires an Ollama server running locally (default: localhost:11434).
  • Official examples are available for detailed usage.

Highlighted Details

  • Supports text completion, streaming completion, chat interactions (with history), and model management (list, show, create, copy, delete).
  • Includes functionality for generating text embeddings, both individually and in batches.
  • Features a Coordinator for LLM function calling and tool integration, with built-in tools like web search and a calculator.
  • Allows creation of custom tools using a #[function] macro for seamless integration.

Maintenance & Community

  • Actively maintained by pepperoni21.
  • Community support channels are not explicitly mentioned in the README.

Licensing & Compatibility

  • The README does not state a license; MIT and Apache-2.0 are common for Rust crates, so check the repository's LICENSE file before use.
  • Compatible with any Rust project that can reach a running Ollama server over the network.

Limitations & Caveats

The README warns that the master branch may be unstable and contain breaking changes. Error handling in the provided examples is minimal and should be hardened for production use.

Health Check

  • Last commit: 3 days ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 12
  • Issues (30d): 3
  • Star History: 96 stars in the last 90 days
