Web interface for local LLM interaction via Ollama API
Top 35.2% on sourcepulse
This project provides a modern, privacy-focused web interface for interacting with local Large Language Models (LLMs) via the Ollama API. It's designed for users who want a clean, responsive graphical front-end to manage and chat with their locally hosted AI models, offering features like Markdown support and local chat history.
How It Works
The GUI is built with Vue.js and Vite, utilizing Tailwind CSS for styling and VueUse for composition utilities. It communicates directly with a running Ollama instance, leveraging IndexedDB for local chat history persistence. This client-side focus ensures all processing and data remain local, enhancing user privacy.
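As a rough illustration of that flow, the sketch below (TypeScript) streams a chat completion from Ollama's /api/chat endpoint and stores the finished conversation in IndexedDB. The endpoint and default port 11434 are Ollama's; the function names, database name, and object store are illustrative assumptions, not taken from the project's source.

// A chat message in the shape the Ollama chat API expects.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Stream an assistant reply from a local Ollama server. With stream: true,
// /api/chat returns newline-delimited JSON chunks, each carrying a partial
// message.content token until a final chunk with done: true.
async function streamChat(
  messages: ChatMessage[],
  onToken: (token: string) => void,
): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "mistral", messages, stream: true }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let reply = "";
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any incomplete trailing line
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      const token: string = chunk.message?.content ?? "";
      reply += token;
      onToken(token);
    }
  }
  return reply;
}

// Persist a finished conversation in the browser's IndexedDB
// (database and store names here are hypothetical, not the project's schema).
function saveConversation(messages: ChatMessage[]): void {
  const open = indexedDB.open("chat-history", 1);
  open.onupgradeneeded = () => {
    open.result.createObjectStore("conversations", { keyPath: "id", autoIncrement: true });
  };
  open.onsuccess = () => {
    const db = open.result;
    const tx = db.transaction("conversations", "readwrite");
    tx.objectStore("conversations").add({ createdAt: Date.now(), messages });
    tx.oncomplete = () => db.close();
  };
}

// Example: send one prompt, render tokens as they arrive, then save the chat.
async function demo(): Promise<void> {
  const history: ChatMessage[] = [{ role: "user", content: "Hello!" }];
  const reply = await streamChat(history, (token) => console.log(token));
  history.push({ role: "assistant", content: reply });
  saveConversation(history);
}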
Quick Start & Requirements
Run the GUI locally:
git clone https://github.com/HelgeSverre/ollama-gui.git && cd ollama-gui && yarn install && yarn dev

A running Ollama instance with at least one pulled model is required:
ollama pull mistral
ollama serve

To use the hosted version at https://ollama-gui.vercel.app instead, start Ollama with the matching CORS origin:
OLLAMA_ORIGINS=https://ollama-gui.vercel.app ollama serve

Alternatively, run the GUI with Docker and access it at http://localhost:8080:
docker compose up -d

GPU support requires uncommenting the relevant lines in compose.yml. A quick connectivity check is sketched below.
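Before opening the GUI, the Ollama model list endpoint can be queried to confirm the server is reachable; a minimal TypeScript sketch (the /api/tags endpoint and default port are Ollama's, the function name is illustrative):

// Confirm Ollama is reachable and list locally available models.
async function listModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) throw new Error(`Ollama not reachable: ${res.status}`);
  const data = await res.json();
  // Each entry has a name such as "mistral:latest".
  return data.models.map((m: { name: string }) => m.name);
}

listModels().then((models) => console.log("Available models:", models));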
Highlighted Details
Markdown support in chat, chat history persisted locally in the browser via IndexedDB, a fully client-side Vue.js + Vite + Tailwind CSS stack, optional Docker deployment, and an MIT license.
Maintenance & Community
The project is maintained by HelgeSverre. The README does not provide community links or a roadmap beyond the features listed above.
Licensing & Compatibility
Released under the MIT License, permitting commercial use and integration with closed-source projects.
Limitations & Caveats
The project is still under active development, with features like a model library browser and mobile responsiveness planned for future releases.