Atomic-Chat by AtomicBot-ai

Local AI chat and assistant platform

Created 1 week ago

410 stars

Top 71.2% on SourcePulse

View on GitHub
Project Summary

Atomic-Chat offers an open-source, privacy-focused alternative to ChatGPT, designed to run entirely offline on a user's computer. It targets users who prioritize data privacy and developers seeking local LLM integration, providing a flexible interface that supports both local and cloud-based AI models. The primary benefit is achieving ChatGPT-like functionality with complete user control and offline capabilities.

How It Works

The application leverages desktop application frameworks like Tauri to provide a native interface. It allows users to download and run various open-source Large Language Models (LLMs) such as Llama, Gemma, and Qwen directly on their hardware. Alternatively, it can connect to popular cloud-based LLM providers, including OpenAI, Anthropic, and Mistral. Key architectural features are an OpenAI-compatible API, which enables local integration with other applications, and support for the Model Context Protocol (MCP) to facilitate agentic workflows.

Quick Start & Requirements

  • Primary install/run command: make dev (handles dependencies, build, and launch).
  • Prerequisites: Node.js ≥ 20.0.0, Yarn ≥ 4.5.3, Make ≥ 3.81, Rust (for Tauri), MetalToolchain (for Apple Silicon).
  • System Requirements: macOS 13.6+; RAM requirements vary by model size: 8GB for 3B models, 16GB for 7B, and 32GB for 13B models.
  • Links: Download/Info: atomic.chat, GitHub Releases. Community: Discord, X/Twitter.
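Once `make dev` has the app running, you can sanity-check that the local API server is listening. A minimal sketch using only the Python standard library; the `/v1/models` path follows the OpenAI convention and is an assumption here, not a documented Atomic-Chat endpoint:

```python
from urllib import request, error

def server_is_up(base_url: str = "http://localhost:1337/v1") -> bool:
    """Return True if the local Atomic-Chat API answers a models query.

    The /models route is assumed from the OpenAI API convention.
    """
    try:
        with request.urlopen(f"{base_url}/models", timeout=2) as resp:
            return resp.status == 200
    except (error.URLError, OSError):
        # Connection refused or timed out: the server is not reachable.
        return False
```

If this returns False, confirm the app launched successfully and that nothing else is bound to port 1337.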

Highlighted Details

  • Local LLM Support: Download and run models from HuggingFace (Llama, Gemma, Qwen, etc.).
  • Cloud Integration: Connect to OpenAI, Anthropic, Mistral, Groq, MiniMax, and others.
  • Custom Assistants: Create specialized AI assistants tailored for specific tasks.
  • OpenAI-Compatible API: Provides a local server at localhost:1337 for seamless integration with other tools.
  • Model Context Protocol (MCP): Enables advanced agentic capabilities.
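Because the local server at localhost:1337 speaks the OpenAI chat-completions dialect, any OpenAI-style client can point at it. A minimal standard-library sketch; the model ID `qwen2.5-3b` is a placeholder (use the ID of a model you have downloaded), and the `/v1/chat/completions` path is assumed from the OpenAI convention:

```python
import json
from urllib import request

BASE_URL = "http://localhost:1337/v1"  # Atomic-Chat's local OpenAI-compatible server

def build_chat_payload(prompt: str, model: str) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(prompt: str, model: str = "qwen2.5-3b") -> str:
    """Send one user turn to the local server and return the reply text.

    The default model ID is a placeholder, not a name from the
    Atomic-Chat docs.
    """
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Requires the app to be running locally:
# print(chat("Summarize the Model Context Protocol in one sentence."))
```

Because the wire format matches OpenAI's, existing SDKs and tools can usually be redirected here just by overriding their base URL.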

Maintenance & Community

The project maintains active community channels on Discord for help and general discussion, and on X/Twitter for updates. Bug reports are managed via GitHub Issues.

Licensing & Compatibility

The project is licensed under the Apache 2.0 license, which generally permits commercial use and integration into closed-source projects, subject to the license terms.

Limitations & Caveats

The current documentation and build process appear primarily focused on macOS. Running larger LLMs locally requires substantial RAM, potentially exceeding standard consumer hardware configurations.

Health Check

  • Last Commit: 1 day ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 2
  • Issues (30d): 0
  • Star History: 413 stars in the last 11 days
