Chat UI: open-source interface for LLMs
This project provides an open-source, SvelteKit-based chat interface designed to power applications like HuggingChat. It enables users to interact with various open-source and commercial Large Language Models (LLMs), offering features like web search integration and customizable model configurations.
How It Works
The UI connects to LLMs via API endpoints, supporting numerous providers including OpenAI-compatible services, Anthropic, Google Vertex AI, Cloudflare, and local inference servers like llama.cpp and Ollama. It leverages a MongoDB instance for storing chat history. A key feature is its integrated web search capability, which uses text embeddings to retrieve relevant information for Retrieval-Augmented Generation (RAG).
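The web-search RAG step described above boils down to embedding the user query and candidate page chunks, then keeping the chunks most similar to the query as context. A minimal cosine-similarity sketch (illustrative only, not the project's actual retrieval code):

```typescript
// Illustrative RAG retrieval step: rank pre-computed chunk embeddings
// against a query embedding by cosine similarity. Not chat-ui's own code.
function dot(a: number[], b: number[]): number {
  return a.reduce((sum, x, i) => sum + x * b[i], 0);
}

function cosineSimilarity(a: number[], b: number[]): number {
  return dot(a, b) / (Math.sqrt(dot(a, a)) * Math.sqrt(dot(b, b)));
}

// Return the indices of the k chunks most similar to the query.
function topK(query: number[], chunks: number[][], k: number): number[] {
  return chunks
    .map((chunk, i) => ({ i, score: cosineSimilarity(query, chunk) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((ranked) => ranked.i);
}

// Toy 2-D embeddings: chunk 1 points the same way as the query,
// chunk 2 is close, chunk 0 is orthogonal.
const query = [1, 0];
const chunks = [[0, 1], [2, 0], [1, 1]];
console.log(topK(query, chunks, 2)); // indices of the two closest chunks
```

In the real pipeline the embeddings come from a sentence-embedding model and the selected chunks are appended to the prompt sent to the LLM.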
Quick Start & Requirements
docker run -p 3000:3000 -e HF_TOKEN=hf_*** -v db:/data ghcr.io/huggingface/chat-ui-db:latest
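Beyond the Docker one-liner, models and endpoints are declared in an .env.local file. A minimal sketch, assuming a local OpenAI-compatible server on port 8000 (the model name and URL are placeholders):

```env
MONGODB_URL=mongodb://localhost:27017
MODELS=`[
  {
    "name": "local-model",
    "endpoints": [
      { "type": "openai", "baseURL": "http://localhost:8000/v1" }
    ]
  }
]`
```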
Optionally, llama.cpp can be used for local LLM execution.
Highlighted Details
Models, endpoints, and other settings are configured through an .env.local file.
Maintenance & Community
The project is actively maintained by Hugging Face. Community support channels are not explicitly mentioned in the README.
Licensing & Compatibility
The project is licensed under AGPL-3.0, a strong copyleft license: derivative works must also be distributed under AGPL-3.0.
Limitations & Caveats
AGPL-3.0's copyleft terms extend to network use, so embedding the UI in closed-source or commercial applications requires care. Running the full HuggingChat configuration locally also involves significant setup, including a Hugging Face Pro account for some models and API keys for services like Serper.dev.