This library provides a JavaScript interface for interacting with the Ollama API, enabling developers to easily integrate large language models into their web and Node.js applications. It simplifies tasks such as chat completion, text generation, model management, and embedding generation, offering a streamlined developer experience for LLM-powered features.
How It Works
The library acts as a client for the Ollama REST API, abstracting away HTTP requests and responses. It exposes methods that map directly to Ollama's endpoints, such as `chat`, `generate`, `pull`, and `embed`. The design supports both standard responses and streaming via `AsyncGenerator`, allowing for real-time LLM output. It also offers browser compatibility and lets you customize the Ollama host and the fetch implementation.
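As a sketch of how streaming via `AsyncGenerator` works in principle (the NDJSON parsing and the mocked body below are illustrative assumptions, not the library's actual internals), a client can expose a newline-delimited JSON response body as an async generator of parsed chunks:

```javascript
// Minimal sketch: turn an NDJSON stream of text chunks into an AsyncGenerator
// of parsed objects -- the same shape of interface ollama-js exposes for
// streamed calls. Parsing details here are illustrative, not the real code.
async function* streamNdjson(body) {
  let buffer = '';
  for await (const chunk of body) {
    buffer += chunk;
    let newline;
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) yield JSON.parse(line);
    }
  }
  if (buffer.trim()) yield JSON.parse(buffer.trim());
}

// Usage with a mocked body (a real client would iterate fetch's response.body):
async function demo() {
  const mockBody = (async function* () {
    yield '{"message":{"content":"Hel"},"done":false}\n';
    yield '{"message":{"content":"lo"},"done":true}\n';
  })();
  let text = '';
  for await (const part of streamNdjson(mockBody)) {
    text += part.message.content;
  }
  return text; // "Hello"
}
```

Consuming the generator with `for await ... of` is what lets callers render partial LLM output as it arrives instead of waiting for the full response.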
Quick Start & Requirements
Install the package with npm, then import the client (use the `ollama/browser` entry point in browser environments):

npm i ollama
import ollama from 'ollama/browser'
Highlighted Details
Streaming support via `AsyncGenerator` for real-time interaction.
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The `files` parameter for model creation is not currently supported. The `abort` method throws an `AbortError` on all listening streams, so manage streams carefully or use a dedicated client instance per stream.
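To illustrate why one `abort` call affects every listening stream (a sketch of the underlying mechanism using a plain `AbortController`, with hypothetical names; this is not the library's code), a single shared signal cancels all consumers wired to it:

```javascript
// Sketch: one AbortController signal cancels every stream attached to it,
// mirroring how ollama.abort() ends all streams on one client instance.
// fakeStream and consume are illustrative stand-ins, not ollama-js APIs.
async function* fakeStream(signal) {
  for (let i = 0; ; i++) {
    if (signal.aborted) {
      const err = new Error('The operation was aborted');
      err.name = 'AbortError';
      throw err;
    }
    yield i;
    // Yield control so an abort() elsewhere can take effect between chunks.
    await new Promise((resolve) => setImmediate(resolve));
  }
}

async function consume(signal) {
  const seen = [];
  try {
    for await (const n of fakeStream(signal)) {
      seen.push(n);
    }
  } catch (err) {
    if (err.name !== 'AbortError') throw err; // only swallow aborts
    return { seen, aborted: true };
  }
  return { seen, aborted: false };
}

async function demo() {
  const controller = new AbortController();
  // Two independent streams share one controller, like two streamed
  // requests running on the same client.
  const a = consume(controller.signal);
  const b = consume(controller.signal);
  setTimeout(() => controller.abort(), 10);
  const results = await Promise.all([a, b]);
  return results.every((r) => r.aborted); // both streams observe the abort
}
```

Giving each stream its own controller (or, with this library, its own client instance) removes that coupling, so canceling one request leaves the others running.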