WitNote by hooosberg

Local-first AI writing companion for cross-platform productivity

Created 3 weeks ago

405 stars

Top 71.8% on SourcePulse

View on GitHub
Project Summary

A local-first AI writing companion for macOS and Windows, WitNote offers a privacy-focused, flexible solution for content creation. It targets users who require robust AI assistance without continuous cloud dependency, providing a lightweight, intelligent tool with an ultra-minimalist interface. The primary benefit is user data sovereignty combined with versatile AI engine choices.

How It Works

WitNote employs a local-first architecture, ensuring all user data and AI processing (when using local engines) remain on the user's device. It uniquely integrates three AI engine options: WebLLM for lightweight, offline use (macOS only); Ollama for powerful, offline local models; and Cloud API for external, high-capacity services. This approach offers flexibility and strong privacy guarantees, allowing users to choose intelligence sources based on performance needs and data sensitivity, all managed through a simple, native card-based interface.
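The trade-off described above can be sketched as a simple selection rule. The following is a minimal, hypothetical illustration of that decision logic (the function and its inputs are assumptions for clarity, not WitNote's actual code):

```python
def pick_engine(platform: str, offline_only: bool, ollama_available: bool) -> str:
    """Hypothetical sketch of choosing among WitNote's three engine options."""
    if offline_only:
        # Fully offline: prefer Ollama's larger local models when installed.
        if ollama_available:
            return "ollama"
        # WebLLM is the lightweight offline fallback, but it is macOS-only.
        if platform == "macos":
            return "webllm"
        raise RuntimeError("No offline engine available on this platform")
    # When cloud use is acceptable, an OpenAI-compatible API offers the most capacity.
    return "cloud-api"

print(pick_engine("macos", offline_only=True, ollama_available=False))
```

In practice the choice is a privacy/performance dial: local engines keep data on-device, while the Cloud API trades that guarantee for model capacity.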

Quick Start & Requirements

  • macOS Installation: Download the .dmg file and drag the application to the Applications folder.
  • Windows Installation: Download and run the .exe installer.
  • Prerequisites:
    • macOS: macOS 12.0+ (13.0+ recommended). Crucially, requires Apple Silicon (M1/M2/M3/M4) chips. 16GB+ RAM recommended.
    • Windows: Windows 10 (64-bit) (11 recommended). Intel Core i5 / AMD Ryzen 5 (i7/Ryzen 7 recommended). 8GB RAM (16GB+ recommended). SSD storage, Discrete GPU with Vulkan support recommended.
  • Links: GitHub Repository

Highlighted Details

  • Local-First Notes: Supports .txt and .md files, with any folder configurable as the notes vault.
  • Triple AI Engine Support: Seamlessly switch between WebLLM (lightweight, offline, macOS), Ollama (powerful local models, offline), and Cloud API (OpenAI-compatible).
  • Privacy Focused: Notes are stored 100% locally, and AI inference stays on-device when using a local engine; the macOS build is Apple-notarized.
  • Minimalist Interface: Features an iOS-style card grid view with drag-and-drop organization and an optional distraction-free focus mode.
  • Customizable AI: Includes a library of 10+ role prompts and supports custom system prompts for tailored AI assistants.
  • Multi-language Support: Adapts UI and AI responses across 8 languages.
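The "OpenAI-compatible" Cloud API option and custom system prompts above both rely on the standard chat-completions request shape. A minimal sketch of building such a request body (the model name is a placeholder, and this is not WitNote's internal code; any OpenAI-compatible service, including a local Ollama server, accepts this format):

```python
import json

def build_chat_request(system_prompt: str, user_text: str,
                       model: str = "gpt-4o-mini") -> str:
    """Build an OpenAI-compatible /v1/chat/completions request body.

    The system message is where a custom role prompt (e.g. one of the
    10+ bundled role prompts) would be injected.
    """
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},  # custom role prompt
            {"role": "user", "content": user_text},
        ],
    }
    return json.dumps(payload)

body = build_chat_request("You are a concise copy editor.",
                          "Tighten this paragraph.")
print(json.loads(body)["messages"][0]["role"])
```

Because the format is a de facto standard, swapping providers is just a matter of changing the base URL and API key.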

Maintenance & Community

The project is maintained by hooosberg. Contact is available via zikedece@proton.me. The repository is hosted on GitHub. No specific community channels (like Discord/Slack) or detailed roadmap are provided in the README.

Licensing & Compatibility

The project is released under the MIT License, which is permissive for commercial use and integration into closed-source projects. However, the strict Apple Silicon requirement for macOS significantly limits hardware compatibility. The Windows version is noted as newly released.

Limitations & Caveats

The macOS version explicitly does not support Intel-based Macs, citing architectural incompatibility and a lack of hardware acceleration that would result in extremely poor performance. The Windows version, being newly released, may trigger Windows SmartScreen or antivirus warnings during installation because it is signed with an individual developer certificate. The WebLLM engine is exclusive to macOS.

Health Check
Last Commit

16 hours ago

Responsiveness

Inactive

Pull Requests (30d)
0
Issues (30d)
12
Star History
409 stars in the last 25 days
