React UI for local LLM interaction
This project provides a minimalistic, offline-capable React UI for interacting with local Large Language Models (LLMs) via Ollama. It targets users who want an improved chatbot experience for local models, offering features like model toggling, conversation history, and prompt templating.
How It Works
The UI is built with React, Next.js, and Tailwind CSS for a modern frontend. It leverages LangChain.js and Ollama to connect to and manage local LLMs. Conversations and context are stored in memory for seamless model switching, and a local database persists chat history.
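The in-memory conversation handling described above can be sketched as a small TypeScript store. This is an illustrative sketch, not the project's actual implementation; the `ConversationStore` class and its method names are assumptions chosen to show how history can survive a model switch because messages live independently of the selected model.

```typescript
type Role = "user" | "assistant";

interface Message {
  role: Role;
  content: string;
}

interface Conversation {
  model: string; // currently selected Ollama model, e.g. "llama3"
  messages: Message[];
}

// Hypothetical in-memory store: conversations are keyed by id,
// and each keeps its own message history.
class ConversationStore {
  private conversations = new Map<string, Conversation>();

  // Fetch a conversation by id, creating it with the given model if absent.
  get(id: string, model: string): Conversation {
    let conv = this.conversations.get(id);
    if (!conv) {
      conv = { model, messages: [] };
      this.conversations.set(id, conv);
    }
    return conv;
  }

  // Append a message to an existing conversation's history.
  append(id: string, message: Message): void {
    const conv = this.conversations.get(id);
    if (conv) conv.messages.push(message);
  }

  // Switch models mid-conversation; history is preserved because the
  // message list is stored separately from the model selection.
  switchModel(id: string, model: string): void {
    const conv = this.conversations.get(id);
    if (conv) conv.model = model;
  }
}
```

In this shape, toggling models only rewrites the `model` field, so the accumulated context can be replayed to whichever Ollama model is active next.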
Quick Start & Requirements
Install dependencies and start the development server:
npm install
npm run dev
A local Ollama instance must be running; by default it listens at http://localhost:11434. Local configuration is read from .env.local.
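As a hedged sketch of what .env.local might contain: the variable name OLLAMA_BASE_URL below is an assumption for illustration, not necessarily the name this project reads; check the repository's documentation for the actual keys.

```shell
# .env.local (example; variable name is hypothetical)
# Point the UI at your local Ollama server.
OLLAMA_BASE_URL=http://localhost:11434
```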
Limitations & Caveats
The project is under active development with several features planned, including message editing, image uploads for multimodal models, conversation summarization, and desktop app conversion.