hollama by fmaclen

Web UI chat app for local LLM inference

created 1 year ago
939 stars

Top 39.8% on sourcepulse

View on GitHub
Project Summary

Hollama is a minimal, browser-based chat application designed for interacting with Large Language Models (LLMs) served via Ollama or OpenAI APIs. It targets users who want a lightweight, client-side interface for LLM experimentation and deployment, offering features like multi-server support, markdown rendering, and local data storage.

How It Works

Hollama is built as a client-side web application, meaning all processing and data storage occur within the user's browser. This architecture eliminates the need for a backend server for the chat interface itself, simplifying deployment and enhancing privacy. It leverages standard web technologies to communicate with Ollama or OpenAI endpoints.
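To make the client-side architecture concrete, here is a minimal sketch of how a browser app like this might talk to an Ollama server. The request shape follows Ollama's `/api/chat` endpoint; the server URL and model name are placeholder assumptions for illustration, not values taken from Hollama's source.

```typescript
// Sketch: building a chat request the way a browser client might,
// targeting Ollama's /api/chat endpoint. Server URL and model name
// below are illustrative assumptions.

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Assemble the URL and fetch options for a non-streaming chat call.
function buildChatRequest(server: string, model: string, messages: ChatMessage[]) {
  return {
    url: `${server}/api/chat`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ model, messages, stream: false }),
    },
  };
}

// Usage: a real client would pass the result straight to fetch().
const req = buildChatRequest('http://localhost:11434', 'llama3', [
  { role: 'user', content: 'Hello!' },
]);
// fetch(req.url, req.init).then((res) => res.json())
```

Because the request is issued directly from the browser, no intermediary backend ever sees the conversation, which is what makes the local-storage, privacy-friendly design possible.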

Quick Start & Requirements

  • Install: Download pre-built binaries for macOS, Windows, and Linux, or self-host using Docker.
  • Prerequisites: Ollama or an OpenAI-compatible API endpoint.
  • Demo & Downloads: https://hollama.dev/
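The Docker self-hosting route might look like the following; the image name `ghcr.io/fmaclen/hollama` and port `4173` are assumptions for illustration, so check the project's own instructions for the authoritative command.

```shell
# Run the Hollama web UI locally (image name and port are assumptions;
# see the project's README/releases for the official instructions).
docker run --rm -d -p 4173:4173 ghcr.io/fmaclen/hollama:latest

# Then open http://localhost:4173 in a browser and point the app
# at your Ollama server, e.g. http://localhost:11434
```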

Highlighted Details

  • Supports both Ollama and OpenAI API endpoints.
  • Features multi-server management for diverse LLM access.
  • Includes advanced UI features like markdown rendering with syntax highlighting, KaTeX support, and code editor functionalities.
  • Offers local data storage, customizable system prompts, and message editing/retrying.


Licensing & Compatibility

  • License: MIT License. Permissive for commercial use and integration with closed-source applications.

Limitations & Caveats

Because the application runs entirely client-side, chat history is confined to the browser's local storage, and very long sessions or heavy rendering (e.g. large markdown/KaTeX responses) may be constrained by browser performance. Model inference itself still happens on the configured Ollama or OpenAI server, not in the browser.

Health Check

  • Last commit: 1 week ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 1
  • Issues (30d): 4
  • Star History: 122 stars in the last 90 days

