ollama-ai by gbaptista

Ruby SDK for local LLM interaction via Ollama API

Created 1 year ago
251 stars

Top 99.9% on SourcePulse

Project Summary

This Ruby gem provides a low-level interface to interact with Ollama's API, enabling developers to run open-source Large Language Models (LLMs) locally. It serves as a foundational tool for building custom AI-powered applications and abstractions on top of Ollama, targeting Ruby developers who need direct control over LLM interactions.

How It Works

The gem acts as a direct client for Ollama's REST API, exposing methods that map to Ollama's core functionalities. It handles request serialization, response parsing, and streaming via Server-Sent Events (SSE). This design offers granular control over model interactions, including text generation, chat completions, embeddings, and model management, without higher-level abstractions that might limit flexibility.
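The request/response cycle the gem wraps can be sketched with plain Ruby: serialize a JSON payload for Ollama's public `/api/generate` endpoint, then decode the streamed reply, which arrives as newline-delimited JSON events. The `parse_stream` helper and the `llama2` model name below are illustrative, not part of the gem's API.

```ruby
require 'json'

# Build the JSON payload a client would POST to Ollama's /api/generate
# endpoint. Keys follow Ollama's public API; `stream: true` asks the
# server to emit one JSON object per generated chunk.
payload = JSON.generate(
  { model: 'llama2', prompt: 'Why is the sky blue?', stream: true }
)

# Illustrative helper (not part of the gem): decode a streamed response
# body of newline-delimited JSON events into an array of hashes.
def parse_stream(body)
  body.each_line.map { |line| JSON.parse(line) }
end

# A simulated two-chunk streamed response, shaped like Ollama's output.
raw = %({"response":"The sky","done":false}\n{"response":" is blue.","done":true}\n)

events = parse_stream(raw)
text = events.map { |e| e['response'] }.join
# text == "The sky is blue."
```

Streaming this way lets callers act on each chunk as it arrives instead of waiting for the full completion.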

Quick Start & Requirements
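The listing omits the quick-start itself. Assuming a locally running Ollama service on the default port (11434), usage looks roughly like the sketch below; the constructor options and the `llama2` model name are drawn from the gem's README conventions and should be verified against the current documentation.

```ruby
require 'ollama-ai' # gem install ollama-ai

# Connect to a locally running Ollama service (default port 11434).
# Option names here are a sketch; check the gem's README for the
# authoritative signatures.
client = Ollama.new(
  credentials: { address: 'http://localhost:11434' },
  options: { server_sent_events: true }
)

# Generate text with a model that has already been pulled
# (e.g. `ollama pull llama2` on the command line).
result = client.generate(
  { model: 'llama2', prompt: 'Hi!' }
)
```

The gem requires an Ollama service to be installed and running separately; it does not bundle or manage the server itself.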

Highlighted Details

  • Comprehensive API coverage: Supports text generation, chat, embeddings, and model management (create, tags, show, copy, delete, pull, push).
  • Streaming support: Integrates with Server-Sent Events (SSE) for real-time responses.
  • Image input: Allows processing images by encoding them as Base64 strings.
  • Customizable client: Supports configuring connection adapters, timeouts, and error handling.
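For the image-input path, the caller performs the Base64 encoding before sending the request. A minimal sketch using Ruby's standard library follows; the `images` key matches Ollama's multimodal API, while the sample bytes and the `llava` model name are placeholders for illustration.

```ruby
require 'base64'

# Encode image bytes as a Base64 string, as Ollama's API expects for
# multimodal requests. Here we encode an in-memory sample; in practice
# you would read a file, e.g. File.binread('photo.png').
image_bytes = "\x89PNG\r\n".b  # first bytes of a PNG header, for illustration
encoded = Base64.strict_encode64(image_bytes)

# The encoded string goes into the request's 'images' array alongside
# the prompt:
request = { model: 'llava', prompt: 'Describe this image.', images: [encoded] }

# Base64 round-trips losslessly:
Base64.strict_decode64(encoded) == image_bytes # => true
```

Because large Base64 payloads and multimodal inference both add latency, this is also where the configurable client timeouts mentioned above become relevant.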

Maintenance & Community

The README details development tasks and publishing procedures but does not list specific maintainers, community channels (like Discord/Slack), or a public roadmap.

Licensing & Compatibility

  • License: MIT License.
  • Compatibility: Runs in standard Ruby environments. The MIT license permits commercial use; the software is provided without warranty, and the authors accept no liability for damages.

Limitations & Caveats

This gem provides low-level access, leaving developers to build their own higher-level abstractions. It depends on a separately running Ollama service. Image processing requires manual Base64 encoding and may call for increased client timeouts.

Health Check

  • Last Commit: 1 year ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 3 stars in the last 30 days

