Web UI for local LLM interaction using Ollama
This project provides a fully featured, responsive web interface for interacting with large language models through Ollama, aimed at users who want a quick and easy way to run LLMs locally or offline. It offers a ChatGPT-like experience with code highlighting, chat history, and model management directly from the UI.
How It Works
The interface is built with Next.js and leverages shadcn-ui for a clean, modern UI. It communicates with a running Ollama instance via its API, allowing users to download, delete, and switch between models. Chats are stored locally in localStorage, eliminating the need for a separate database and enhancing privacy and ease of use.
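The following is a minimal TypeScript sketch of that pattern, not the project's actual code: it assumes Ollama's /api/chat endpoint on the default port 11434, a browser environment with localStorage, and a hypothetical "chats" storage key.

```ts
// Illustrative sketch of the pattern described above, not the project's source.
type ChatMessage = { role: "user" | "assistant"; content: string };

const OLLAMA_URL = "http://localhost:11434"; // Ollama's default port
const STORAGE_KEY = "chats"; // hypothetical key; the real app may use another

async function sendMessage(model: string, history: ChatMessage[]): Promise<ChatMessage> {
  // Non-streaming request for brevity; a chat UI would typically stream tokens.
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: history, stream: false }),
  });
  const data = await res.json();
  return data.message as ChatMessage; // { role: "assistant", content: "..." }
}

function saveChat(id: string, history: ChatMessage[]): void {
  // Persist the whole conversation client-side; no server-side database involved.
  const all = JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "{}");
  all[id] = history;
  localStorage.setItem(STORAGE_KEY, JSON.stringify(all));
}
```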
Quick Start & Requirements
docker run -d -p 8080:3000 --add-host=host.docker.internal:host-gateway -e OLLAMA_URL=http://host.docker.internal:11434 --name nextjs-ollama-ui --restart always jakobhoeg/nextjs-ollama-ui:latest
For local development, run npm install to install dependencies, then npm run dev to start the development server.
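Either way, a running Ollama server is required. A quick reachability check, sketched in TypeScript below, assumes Node 18+ (built-in fetch), the default OLLAMA_URL from the Docker command above, and Ollama's /api/tags endpoint for listing locally installed models; the script itself is illustrative, not part of the project.

```ts
// check-ollama.ts -- verify that a local Ollama server is reachable.
const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434";

async function main(): Promise<void> {
  try {
    // GET /api/tags returns the models currently available to Ollama.
    const res = await fetch(`${OLLAMA_URL}/api/tags`);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = (await res.json()) as { models: { name: string }[] };
    console.log("Ollama is up. Models:", data.models.map((m) => m.name));
  } catch (err) {
    console.error(`Could not reach Ollama at ${OLLAMA_URL}:`, err);
    process.exit(1);
  }
}

main();
```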
Highlighted Details
Maintenance & Community
This is a hobby project. Links to community resources are not provided in the README.
Licensing & Compatibility
The project does not explicitly state a license in the README.
Limitations & Caveats
As a hobby project, it may lag behind more feature-complete alternatives. Planned features are listed in the README but not yet implemented.