pepperoni21/ollama-rs: Rust SDK for the Ollama API
Top 39.3% on SourcePulse
This Rust library provides a straightforward interface for interacting with the Ollama API, targeting developers building applications that leverage local large language models. It simplifies common LLM operations like text generation, chat, model management, and embedding creation, enabling rapid integration of Ollama into Rust projects.
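As a minimal sketch of the text-generation use case described above, assuming ollama-rs 0.3.x, a Tokio runtime, and a local Ollama server with the "llama3.2" model already pulled (the model name is an assumption here):

```rust
// One-shot completion sketch; requires a running Ollama server and the
// "llama3.2" model (an assumed example model, not mandated by the crate).
use ollama_rs::{generation::completion::request::GenerationRequest, Ollama};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Ollama::default() connects to the server at localhost:11434.
    let ollama = Ollama::default();

    let request = GenerationRequest::new(
        "llama3.2".to_string(),
        "Why is the sky blue?".to_string(),
    );

    // Non-streaming call: the full generated text arrives in one response.
    let response = ollama.generate(request).await?;
    println!("{}", response.response);
    Ok(())
}
```

Chat, embeddings, and model-management calls follow the same pattern: build a request struct, then await the corresponding method on the `Ollama` client.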
How It Works
The library utilizes Rust's async/await capabilities and the reqwest HTTP client to communicate with the Ollama server. It maps Ollama's REST API endpoints to idiomatic Rust functions, offering both standard and streaming responses for generation and model creation tasks. A Coordinator struct is introduced to manage tool usage and function calling, allowing LLMs to interact with external capabilities.
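The streaming mode mentioned above can be sketched as follows, assuming the crate's streaming feature is enabled and `tokio-stream` is available; the exact item type yielded by the stream may differ between versions:

```rust
// Streaming sketch: print partial responses as the model produces them.
// Requires a running Ollama server; "llama3.2" is an assumed example model.
use ollama_rs::{generation::completion::request::GenerationRequest, Ollama};
use tokio_stream::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let ollama = Ollama::default();
    let request = GenerationRequest::new(
        "llama3.2".to_string(),
        "Explain async/await in one paragraph.".to_string(),
    );

    // generate_stream yields chunks of partial responses, so the caller can
    // display output incrementally instead of waiting for the full reply.
    let mut stream = ollama.generate_stream(request).await?;
    while let Some(chunk) = stream.next().await {
        for partial in chunk? {
            print!("{}", partial.response);
        }
    }
    Ok(())
}
```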
Quick Start & Requirements
Add the crate to your Cargo.toml:

ollama-rs = "0.3.1"

Or track the latest development version:

ollama-rs = { git = "https://github.com/pepperoni21/ollama-rs.git", branch = "master" }

A running Ollama server is required (by default at localhost:11434).

Highlighted Details
Coordinator for LLM function calling and tool integration, with built-in tools such as web search and a calculator. Custom tools can be defined with the #[function] macro for seamless integration.

Maintenance & Community
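A hedged sketch of the Coordinator with built-in tools follows. The type names `DDGSearcher`, `Calculator`, and `ChatMessage`, the module paths, and the response shape are assumptions based on the crate's documentation and may differ in your version:

```rust
// Coordinator sketch: the model can call registered tools (here, an assumed
// web-search tool and calculator) while answering a chat message.
// Requires a running Ollama server and a tool-capable model; "qwen2.5:32b"
// is an assumed example model.
use ollama_rs::{
    coordinator::Coordinator,
    generation::chat::ChatMessage,
    generation::tools::implementations::{Calculator, DDGSearcher},
    Ollama,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let ollama = Ollama::default();
    let history = vec![];

    // The Coordinator routes the model's tool-call requests to the
    // registered tools and feeds the results back into the conversation.
    let mut coordinator = Coordinator::new(ollama, "qwen2.5:32b".to_string(), history)
        .add_tool(DDGSearcher::new())
        .add_tool(Calculator {});

    let response = coordinator
        .chat(vec![ChatMessage::user("What is 1234 * 5678?".to_string())])
        .await?;
    println!("{}", response.message.content);
    Ok(())
}
```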
Licensing & Compatibility
Limitations & Caveats
The README notes that the master branch may be unstable and contain breaking changes. Error handling in the provided examples is minimal and should be made robust in production code.