NativeMindExtension by NativeMindBrowser

On-device AI browser assistant for private workflows

Created 3 months ago
649 stars

Top 51.3% on SourcePulse

View on GitHub
Project Summary

NativeMind is an open-source browser extension that brings private, on-device AI to the browser as an alternative to cloud-based assistants such as ChatGPT. It targets users who are concerned about data privacy and want AI in their browsing workflow for tasks like summarization, translation, and content enhancement, with zero tracking and offline functionality.

How It Works

NativeMind integrates with local AI models primarily through Ollama, allowing users to run various LLMs directly on their machine. It also offers a trial experience via WebLLM for immediate, zero-setup AI interaction within the browser. This on-device approach ensures all data processing remains local, enhancing privacy and eliminating reliance on external servers.
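To make the Ollama path concrete, here is a minimal sketch of what "on-device summarization" can look like: a fetch call to Ollama's local REST API (/api/generate on its default port 11434). This is illustrative only and not NativeMind's actual code; the function and the model name are assumptions for the example.

```typescript
// Minimal sketch (not NativeMind's implementation): summarize page text by
// calling a locally running Ollama instance over its REST API.
// Assumes Ollama is listening on its default port (11434) and that the model
// named below has already been pulled.

interface OllamaGenerateResponse {
  response: string;
  done: boolean;
}

async function summarizePage(pageText: string, model = "llama3"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      prompt: `Summarize the following page in a few sentences:\n\n${pageText}`,
      stream: false, // ask for a single JSON object instead of a token stream
    }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed: HTTP ${res.status}`);
  }
  const data = (await res.json()) as OllamaGenerateResponse;
  return data.response.trim();
}
```

Because the request never leaves localhost, the page content stays on the user's machine, which is the core of the privacy claim above.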

Quick Start & Requirements

  • Installation: Install the extension from the Chrome Web Store or Firefox Add-ons. For full AI capabilities, set up Ollama locally.
  • Prerequisites: Ollama (recommended for advanced models); Node.js (v22.14.0+) and PNPM (v10.10.0+) for building the extension from source. WebLLM requires a modern browser and sufficient RAM (8GB+ recommended).
  • Setup: Pin the extension, set up Ollama if desired, and select your AI model (see the connectivity sketch after this list).
  • Links: Official Website, Discord Community.
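As a quick sanity check during setup, you can confirm that a local Ollama instance is reachable and see which models are already pulled via its /api/tags endpoint. This is a hedged, illustrative sketch, not part of the extension; the function name is made up for the example.

```typescript
// Minimal sketch (illustrative only): verify that Ollama is running locally
// and list the models that have been pulled, using the /api/tags endpoint
// on Ollama's default port.

interface OllamaTagsResponse {
  models: { name: string }[];
}

async function listLocalModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) {
    throw new Error(`Ollama is not reachable (HTTP ${res.status})`);
  }
  const data = (await res.json()) as OllamaTagsResponse;
  return data.models.map((m) => m.name);
}

listLocalModels()
  .then((models) => console.log("Available local models:", models))
  .catch((err) => console.error(err.message));
```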

Highlighted Details

  • Contextual awareness across tabs for continuous AI conversations.
  • AI-powered local search and smart page summarization.
  • Bilingual translation with side-by-side views.
  • Offline-first functionality after model download.

Maintenance & Community

The project actively encourages contributions and has a Discord community for support. Links to social media and a roadmap are available.

Licensing & Compatibility

Licensed under the GNU Affero General Public License v3.0 (AGPL v3), which mandates that derivative works also be open-sourced under the same license. This may have implications for commercial or closed-source integrations.

Limitations & Caveats

WebLLM functionality is described as a "quick trial" and may have performance limitations compared to Ollama. Features like "Chat with PDFs" and "Chat with Images" are marked as "coming soon." The AGPL v3 license requires derived works to be open-sourced, which could restrict certain commercial use cases.

Health Check

  • Last Commit: 22 hours ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 28
  • Issues (30d): 0
  • Star History: 42 stars in the last 30 days

Explore Similar Projects

Starred by Sourabh Bajaj (Cofounder of Uplimit), Chip Huyen (Author of "AI Engineering" and "Designing Machine Learning Systems"), and 3 more.

NextChat by ChatGPTNextWeb

Top 0.1% on SourcePulse
86k stars
AI assistant for web, iOS, macOS, Android, Linux, and Windows
Created 2 years ago
Updated 3 days ago