ollama-ai-provider by sgomez

Vercel AI Provider for local LLM inference via Ollama

Created 1 year ago
318 stars

Top 86.3% on sourcepulse

Project Summary

This project provides a Vercel AI SDK provider for local Large Language Models (LLMs) served by Ollama. It targets developers building on Vercel's AI SDK who want to run models locally for text generation, streaming, image input, and tool usage, as a cost-effective, privacy-preserving alternative to cloud-based LLM APIs.

How It Works

The provider acts as a bridge between the Vercel AI SDK and the Ollama API: it maps SDK calls for text generation, streaming, and image input to the corresponding Ollama endpoints, so developers keep the familiar SDK interface without handling Ollama's API directly. The core advantage is local LLM execution, which keeps data on the machine, avoids network round-trips, and eliminates external API costs.
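
For example, a minimal sketch of that mapping, assuming Ollama is running on its default local port and the model has already been pulled (the model name and prompt are illustrative):

```ts
import { generateText } from 'ai';
import { ollama } from 'ollama-ai-provider';

// 'llama3.1' is illustrative; pass any model already available in the
// local Ollama instance (e.g. pulled via `ollama pull llama3.1`).
const { text } = await generateText({
  model: ollama('llama3.1'),
  prompt: 'Why is the sky blue?',
});

console.log(text);
```

The package also exports a `createOllama` factory for pointing the provider at a non-default Ollama base URL, e.g. when Ollama runs on another host.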

Quick Start & Requirements

  • Install with: npm i ollama-ai-provider
  • Requires Ollama >= 0.5.0.
  • Supports image input with vision-capable models (e.g., llava, llava-llama3); see the sketch after this list.
  • Official documentation and examples are available within the repository.
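
As an illustration of the image-input path mentioned above, a sketch using the Vercel AI SDK's multi-part message format (the file path and model choice are assumptions; `llava` must be pulled locally first):

```ts
import { readFileSync } from 'node:fs';
import { generateText } from 'ai';
import { ollama } from 'ollama-ai-provider';

// 'llava' is a vision-capable model (`ollama pull llava`);
// './photo.jpg' is a placeholder path.
const { text } = await generateText({
  model: ollama('llava'),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Describe this image in one sentence.' },
        { type: 'image', image: readFileSync('./photo.jpg') },
      ],
    },
  ],
});

console.log(text);
```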

Highlighted Details

  • Supports text generation and streaming (see the streaming sketch after this list).
  • Image input is supported for vision-capable models.
  • Object generation is supported but noted as unstable with certain models due to Ollama limitations.
  • Tool usage is supported, though object-tool mode may have model-specific issues.
  • Tool streaming is experimental and simulated by segmenting complete responses.
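
As referenced in the first bullet, a minimal streaming sketch using the AI SDK's `streamText` (the model name is again illustrative):

```ts
import { streamText } from 'ai';
import { ollama } from 'ollama-ai-provider';

const { textStream } = await streamText({
  model: ollama('llama3.1'),
  prompt: 'Explain token streaming in two sentences.',
});

// Deltas arrive incrementally as Ollama generates tokens.
for await (const delta of textStream) {
  process.stdout.write(delta);
}
```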

Maintenance & Community

The project is maintained by sgomez. Further community or maintenance details are not explicitly provided in the README.

Licensing & Compatibility

The README does not specify a license. Compatibility for commercial use or closed-source linking is not detailed.

Limitations & Caveats

Object generation can be unstable and may fail with certain models. Tool streaming is simulated rather than natively supported by Ollama. The project notes that some issues are inherent to the models or to Ollama itself and may not be fully resolvable.

Health Check

  • Last commit: 6 months ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 1
  • Issues (30d): 0
  • Star History: 38 stars in the last 90 days
