gbaptista/ollama-ai: Ruby SDK for local LLM interaction via Ollama API
This Ruby gem provides a low-level interface to Ollama's API, enabling developers to interact with open-source Large Language Models (LLMs) running locally. It serves as a foundational tool for building custom AI-powered applications and abstractions on top of Ollama, targeting Ruby developers who need direct control over LLM interactions.
How It Works
The gem acts as a direct client for Ollama's REST API, exposing methods that map to Ollama's core functionalities. It handles request serialization, response parsing, and streaming capabilities via Server-Sent Events (SSE). This approach offers granular control over model interactions, including text generation, chat completions, embeddings, and model management, without introducing higher-level abstractions that might limit flexibility.
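To make the streaming flow concrete, here is a minimal sketch of a streamed text generation. The call shape (`Ollama.new` with `credentials:` and `options:`, and `client.generate` yielding events to a block) follows the gem's documented interface, but parameter names may differ between versions and the model name is only an example; verify against the installed release.

```ruby
require 'ollama-ai'

# Connect to a locally running Ollama server at its default address.
client = Ollama.new(
  credentials: { address: 'http://localhost:11434' },
  options: { server_sent_events: true } # stream responses via SSE
)

# Stream a completion; the block is called once per server-sent event.
client.generate(
  { model: 'llama2', prompt: 'Why is the sky blue?' }
) do |event, _raw|
  print event['response'] # the partial text carried by this event
end
```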
Quick Start & Requirements
Add `gem 'ollama-ai', '~> 1.3.0'` to your Gemfile, or install it directly with `gem install ollama-ai -v 1.3.0`. The gem requires a running Ollama server, reachable by default at `http://localhost:11434`.
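As a first request after installation, the sketch below sends a single-turn chat completion. As above, the `client.chat` shape mirrors the gem's README; the model name and the event structure (`message` → `content`, matching Ollama's chat API) are assumptions to check against your setup.

```ruby
require 'ollama-ai'

client = Ollama.new(credentials: { address: 'http://localhost:11434' })

# A single-turn chat completion; the gem returns the collected events.
result = client.chat(
  { model: 'llama2',
    messages: [
      { role: 'user', content: 'Hello! What can you do?' }
    ] }
)

# Each event carries a fragment of the assistant's reply.
puts result.map { |event| event.dig('message', 'content') }.join
```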
Maintenance & Community
The README details development tasks and publishing procedures but does not list specific maintainers, community channels (like Discord/Slack), or a public roadmap.
Licensing & Compatibility
Limitations & Caveats
This gem provides low-level access and requires developers to build their own higher-level abstractions. It depends on a separately running Ollama service. Image processing requires manually Base64-encoding the image data and may necessitate increased client timeouts, as sketched below.
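To make the image caveat concrete, here is a sketch of sending a Base64-encoded image to a multimodal model. The `images` parameter mirrors Ollama's generate API, `llava` and `photo.png` are example values, and the connection timeout option is an assumption about how the gem exposes its HTTP settings; verify all of these against the gem's documentation.

```ruby
require 'ollama-ai'
require 'base64'

# Image requests can be slow; raising the request timeout is often necessary.
# NOTE: this connection/timeout option is an assumption; check the gem docs.
client = Ollama.new(
  credentials: { address: 'http://localhost:11434' },
  options: { connection: { request: { timeout: 120 } } }
)

# Ollama expects images as Base64-encoded strings.
encoded_image = Base64.strict_encode64(File.read('photo.png'))

result = client.generate(
  { model: 'llava', # an example multimodal model
    prompt: 'Describe this image.',
    images: [encoded_image] }
)

puts result.map { |event| event['response'] }.join
```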