localgpt by localgpt-app

Local AI assistant for private, offline operation

Created 3 weeks ago

995 stars

Top 37.2% on SourcePulse

View on GitHub
Project Summary

LocalGPT is a Rust-based AI assistant engineered for local, private operation on user devices, addressing data privacy concerns and the need for offline AI capabilities. It targets power users and developers seeking a self-contained, high-performance assistant that avoids cloud dependencies. The primary benefit is a fully functional AI assistant running entirely on the user's machine, with persistent memory and autonomous task execution.

How It Works

LocalGPT utilizes plain Markdown files (.md) as its core memory store, organizing long-term knowledge, autonomous task queues (HEARTBEAT.md), and personality guidance (SOUL.md). It indexes these files using SQLite with FTS5 for fast keyword search and sqlite-vec for semantic search powered by local embeddings, enabling efficient retrieval. The application is built on a Rust, Tokio, and Axum stack, designed for high performance and minimal resource footprint, culminating in a single, self-contained binary.
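The keyword-search half of this retrieval layer can be sketched in a few lines. The example below uses Python's stdlib sqlite3 purely for illustration (LocalGPT itself is Rust); the table and column names are hypothetical, and it assumes your SQLite build ships with the FTS5 extension:

```python
import sqlite3

# In-memory DB for illustration; LocalGPT persists its index on disk.
conn = sqlite3.connect(":memory:")

# Hypothetical schema: one FTS5 row per Markdown memory file.
conn.execute("CREATE VIRTUAL TABLE memory USING fts5(path, body)")
conn.executemany(
    "INSERT INTO memory VALUES (?, ?)",
    [
        ("HEARTBEAT.md", "- [ ] summarize yesterday's notes"),
        ("SOUL.md", "Be concise. Prefer local tools over cloud services."),
        ("notes/rust.md", "Tokio is an async runtime for Rust."),
    ],
)

# Fast keyword search via FTS5 MATCH; the semantic half would store
# local embeddings in a parallel sqlite-vec table.
rows = conn.execute(
    "SELECT path FROM memory WHERE memory MATCH ? ORDER BY rank",
    ("tokio",),
).fetchall()
print(rows)  # → [('notes/rust.md',)]
```

FTS5's default tokenizer is case-insensitive, so the query `tokio` matches "Tokio" in the indexed Markdown.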

Quick Start & Requirements

  • Installation: Install via Cargo: cargo install localgpt. For a headless version (no desktop GUI), use cargo install localgpt --no-default-features.
  • Prerequisites: Rust toolchain (for Cargo installation). LLM provider API keys (e.g., Anthropic, OpenAI) are required unless using local models via Ollama.
  • Setup: Initialize configuration with localgpt config init.
  • Running: Start interactive chat with localgpt chat or ask a single question with localgpt ask "question". Run as a daemon with localgpt daemon start.
  • Documentation: See the blog post “Why I Built LocalGPT in 4 Nights”.
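The install-and-run flow above, collected as a shell transcript (commands as given in the summary; a Rust toolchain and network access are required):

```shell
# Install (full build, or headless without the desktop GUI)
cargo install localgpt
cargo install localgpt --no-default-features

# First-time setup
localgpt config init

# Interact
localgpt chat                  # interactive session
localgpt ask "question"        # one-shot question
localgpt daemon start          # run in the background
```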

Highlighted Details

  • Single Binary: Distributes as a single, self-contained executable, eliminating the need for Node.js, Docker, or Python environments.
  • Local-First Design: All data and processing remain on the user's machine, ensuring data privacy.
  • Persistent Memory: Leverages Markdown files for knowledge storage, compatible with OpenClaw formats.
  • Autonomous Tasks: Features a background "heartbeat" for delegating and executing tasks autonomously.
  • Multiple Interfaces: Supports CLI, a web UI, a desktop GUI, and a Telegram bot for interaction.
  • LLM Flexibility: Integrates with multiple LLM providers including Anthropic (Claude), OpenAI, and Ollama.
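As an illustration of the heartbeat idea, a background loop could scan HEARTBEAT.md for unchecked Markdown tasks. The checkbox format below is a hypothetical sketch, not LocalGPT's documented schema:

```python
import re

# Hypothetical HEARTBEAT.md contents: Markdown checkboxes as a task queue.
heartbeat_md = """\
# Heartbeat

- [x] index new notes
- [ ] summarize today's chat log
- [ ] check RSS feeds for updates
"""

def pending_tasks(markdown: str) -> list[str]:
    """Return unchecked '- [ ]' items, i.e. tasks still waiting to run."""
    return re.findall(r"^- \[ \] (.+)$", markdown, flags=re.MULTILINE)

tasks = pending_tasks(heartbeat_md)
print(tasks)  # → ["summarize today's chat log", "check RSS feeds for updates"]
```

A daemon would re-read the file on each heartbeat tick, execute pending items, and check them off by rewriting the line.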

Maintenance & Community

The project is open source, with listed contributors and roughly 1,000 stargazers. A blog post details the development story. The README links no dedicated community channels (such as Discord or Slack) and no roadmap.

Licensing & Compatibility

Licensed under the Apache-2.0 license. This license is permissive and generally compatible with commercial use and linking within closed-source projects.

Limitations & Caveats

The local-device focus implies scalability limits for enterprise-level, distributed deployments without significant architectural changes. Unless local models are served via Ollama, configuration also requires managing API keys for external LLM providers.

Health Check

  • Last Commit: 22 hours ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 45
  • Issues (30d): 31
  • Star History: 1,002 stars in the last 24 days
