llm-x by mrdjohnson

Local LLM UI for web/mobile, supporting multiple models

created 1 year ago
262 stars

Top 97.8% on sourcepulse

Project Summary

LLM-X provides a user-friendly, offline-first web and Chrome extension interface for interacting with local Large Language Models (LLMs). It targets users who run LLM inference locally via tools like Ollama, LM Studio, or AUTOMATIC1111, offering a private and convenient way to chat, generate images, and manage conversations.

How It Works

LLM-X is built with a React/TypeScript frontend and Vite, shipping as both a Progressive Web App (PWA) and a Chrome extension. It connects to local LLM servers via their HTTP APIs and supports multiple simultaneous connections and models. The application prioritizes privacy: all operations run locally, with no external API calls, and chat history is stored in IndexedDB for offline access.
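The exact wiring is internal to the app, but the pattern is easy to illustrate. The sketch below is hypothetical (not taken from the repository; the model name llama3 and database name llm-history are placeholders): a browser client posts a chat turn to Ollama's documented /api/chat endpoint and persists the exchange in IndexedDB.

```typescript
// Hypothetical sketch, not LLM-X's actual code: one chat turn against a
// local Ollama server, persisted to IndexedDB for offline history.

interface ChatMessage {
  role: 'user' | 'assistant';
  content: string;
}

// Ollama's non-streaming chat endpoint; 'llama3' is a placeholder model.
async function askOllama(messages: ChatMessage[]): Promise<string> {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'llama3', messages, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.message.content as string;
}

// Open (or create) a simple "chats" object store.
function openHistoryDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('llm-history', 1);
    req.onupgradeneeded = () =>
      req.result.createObjectStore('chats', { autoIncrement: true });
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

// Store one question/answer pair; everything stays on the local machine.
async function saveTurn(user: string, assistant: string): Promise<void> {
  const db = await openHistoryDb();
  const tx = db.transaction('chats', 'readwrite');
  tx.objectStore('chats').add({ user, assistant, at: Date.now() });
  await new Promise<void>((resolve, reject) => {
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

async function main() {
  const question = 'Why is the sky blue?';
  const answer = await askOllama([{ role: 'user', content: question }]);
  await saveTurn(question, answer);
  console.log(answer);
}
main();
```

A real client like LLM-X streams tokens (stream: true) and renders them incrementally; the non-streaming call just keeps the sketch short.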

Quick Start & Requirements

  • Web Client: Run OLLAMA_ORIGINS=https://mrdjohnson.github.io ollama serve (for Ollama) or lms server start --cors=true (for LM Studio), then navigate to https://mrdjohnson.github.io/llm-x/ (see the sketch after this list for why the origin allow-list matters).
  • Chrome Extension: Install from Chrome Web Store or build locally. For Ollama, set OLLAMA_ORIGINS=chrome-extension://iodcdhcpahifeligoegcmcdibdkffclk.
  • Prerequisites: Ollama, LM Studio, or AUTOMATIC1111. Gemini Nano requires Chrome Canary with specific flags enabled.
  • Setup: Minimal for the web client; the Chrome extension may require the Ollama origin configuration shown above.
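The origin flags exist because the hosted page and the local server are different origins, so the browser enforces CORS on every request. A hypothetical connection check (illustrative only; /api/tags is Ollama's documented model-listing endpoint):

```typescript
// Hypothetical check, not LLM-X's actual code. Served from
// https://mrdjohnson.github.io, this cross-origin request to the local
// Ollama server only succeeds if OLLAMA_ORIGINS allow-lists that origin.
async function listLocalModels(): Promise<string[]> {
  const res = await fetch('http://localhost:11434/api/tags');
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data: { models: { name: string }[] } = await res.json();
  return data.models.map((m) => m.name);
}

listLocalModels()
  .then((names) => console.log('Available models:', names))
  // A CORS rejection surfaces here as a generic TypeError in the browser.
  .catch((err) => console.error('Is OLLAMA_ORIGINS set correctly?', err));
```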

Highlighted Details

  • Supports Ollama, LM Studio, OpenAI servers, Gemini Nano, and AUTOMATIC1111.
  • Features multi-model chat, image generation/analysis, code highlighting, and a command bar (kbar).
  • Includes chat history management, system prompt customization ("Personas"), and import/export functionality.
  • Progressive Web App (PWA) support allows for offline use and installation.

Maintenance & Community

The project is actively maintained by mrdjohnson. Continuous deployment is set up, with changes to the master branch automatically deploying to https://mrdjohnson.github.io/llm-x/.

Licensing & Compatibility

The project appears to be MIT licensed, allowing for commercial use and integration with closed-source applications.

Limitations & Caveats

The README mentions initial difficulties with LangChain.js stream handling, though streaming has since been integrated. Code highlighting is limited to common languages.

Health Check

  • Last commit: 1 day ago
  • Responsiveness: 1 day
  • Pull requests (30d): 7
  • Issues (30d): 8
  • Star history: 25 stars in the last 90 days
