macOS app for local LLM chat using files, folders, and websites
Sidekick is a native macOS application designed for local, private, and context-aware AI interactions. It empowers users to chat with local Large Language Models (LLMs) that can access and synthesize information from files, folders, and websites on their Mac, all offline. This tool is ideal for students, researchers, and professionals seeking to leverage AI for document analysis, research, and content creation without compromising data privacy or requiring external API keys.
How It Works
Sidekick uses Retrieval Augmented Generation (RAG) to process user queries against a configured set of "experts" (collections of files, folders, or websites). It leverages llama.cpp for efficient local LLM inference, particularly optimized for Apple Silicon. The application supports function calling for enhanced reasoning and task execution, memory for personalized interactions, and features such as an inline writing assistant, AI content detection, diagram generation, and presentation creation. It also integrates Core ML for on-device image generation (requiring macOS 15.2+ and Apple Intelligence).
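The RAG flow described above can be sketched in miniature. This is an illustrative toy, not Sidekick's actual implementation: a bag-of-words similarity stands in for a real embedding model, the local LLM call is left out, and all function names are hypothetical.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank an expert's chunks by similarity to the query; keep the top k."""
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, context_chunks: list[str]) -> str:
    """Augment the query with retrieved context before local LLM inference."""
    context = "\n".join(f"- {c}" for c in context_chunks)
    return f"Context:\n{context}\n\nQuestion: {query}"

# Example "expert": a small collection of document chunks.
chunks = [
    "Sidekick runs llama.cpp for local inference on Apple Silicon.",
    "Image generation requires macOS 15.2 and Apple Intelligence.",
    "Experts are collections of files, folders, or websites.",
]
query = "What powers local inference?"
prompt = build_prompt(query, retrieve(query, chunks))
```

In the real application, the retrieved context would be passed to the llama.cpp-backed model; here the final `prompt` string is simply the augmented input that such a call would receive.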
Quick Start & Requirements
Highlighted Details
Maintenance & Community
The project is maintained by johnbean393. Contributions are welcome. Contact is available via email (johnbean393@gmail.com) or by filing an issue.
Licensing & Compatibility
The repository does not explicitly state a license in the provided README. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
Image generation requires macOS 15.2 or later and Apple Intelligence. The project's licensing status is unclear, which may limit commercial adoption. Developer setup requires Xcode and appropriate code-signing identities.