llm-ollama by taketwo

LLM plugin for accessing models running on an Ollama server

created 1 year ago
324 stars

Top 85.2% on sourcepulse

Project Summary

This plugin provides seamless integration between the llm CLI tool and Ollama, enabling users to leverage local LLM models for prompting, chatting, embeddings, and structured output generation. It targets developers and power users who want to utilize Ollama's extensive model library within the llm ecosystem.

How It Works

The plugin acts as a bridge, querying the Ollama server for available models and registering them with the llm CLI. It supports various Ollama features, including multi-modal image inputs, embedding generation, and structured JSON output via schemas. For asynchronous operations, it provides access to async models for use with Python's asyncio.
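The async model access mentioned above can be sketched with the llm Python library. A minimal sketch, assuming a local Ollama server is running and a model has been pulled; the model name `llama3.2` is illustrative, and the snippet will fail without a reachable server:

```python
import asyncio

import llm  # the llm library this plugin extends


async def main() -> None:
    # Resolve the async variant of an Ollama-backed model registered by the plugin.
    model = llm.get_async_model("llama3.2")  # illustrative model name
    response = await model.prompt("Briefly explain what Ollama is.")
    print(await response.text())


asyncio.run(main())
```

Because the plugin registers models with llm's standard model registry, the same `llm.get_async_model()` call works for any pulled Ollama model.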

Quick Start & Requirements

  • Install via llm install llm-ollama.
  • Requires a running Ollama server with pulled models.
  • Supports multi-modal models (e.g., llava) and embedding models.
  • Ollama server address can be configured via OLLAMA_HOST environment variable.
  • Development setup instructions are provided for local testing and linting.
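The steps above can be sketched end to end. A hedged example, assuming Ollama is installed locally and using `llama3.2` as a stand-in model name; all commands require a running Ollama server:

```shell
# Install the plugin into llm
llm install llm-ollama

# Pull a model on the Ollama side (server must be running)
ollama pull llama3.2

# Point llm at a non-default server address if needed
export OLLAMA_HOST=http://localhost:11434

# Prompt the pulled model through the llm CLI
llm -m llama3.2 "Three keywords that describe Ollama"
```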

Highlighted Details

  • Automatic alias creation for models with :latest tags.
  • Supports image attachments for multi-modal models.
  • Handles both regular and specialized embedding models.
  • Enables structured output generation using Ollama's schema support.
  • Provides async model access for asyncio integration.
  • Allows passing Ollama modelfile parameters as options (e.g., -o temperature 0.8).
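The option, attachment, embedding, and schema support listed above can be illustrated with the llm CLI. A sketch assuming the server is running and the named models are pulled; `llama3.2`, `llava`, and `mxbai-embed-large` are illustrative choices, and the image path is a placeholder:

```shell
# Pass Ollama modelfile parameters as -o options
llm -m llama3.2 -o temperature 0.8 "Write a haiku about local LLMs"

# Attach an image to a multi-modal model
llm -m llava "Describe this image" -a path/to/image.jpg

# Generate an embedding with an Ollama embedding model
llm embed -m mxbai-embed-large -c "hello world"

# Request structured JSON output via llm's concise schema syntax
llm -m llama3.2 --schema "name, age int" "Invent a person"
```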

Maintenance & Community

No specific contributors, sponsorships, or community links (Discord/Slack) are mentioned in the README.

Licensing & Compatibility

The README does not explicitly state the license type. Compatibility for commercial use or closed-source linking is not specified.

Limitations & Caveats

The README does not detail any specific limitations, known bugs, or deprecation notices. The project appears to be actively maintained for integration with the llm CLI.

Health Check

  • Last commit: 5 days ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 1
  • Issues (30d): 3
  • Star History: 45 stars in the last 90 days

Explore Similar Projects

Starred by John Resig (author of jQuery; Chief Software Architect at Khan Academy), Shawn Wang (editor of Latent Space), and 3 more.

ollama-js by ollama

JS SDK for Ollama
4k stars · Top 0.4%
created 1 year ago
updated 3 weeks ago