OpenOats by yazinsai

AI meeting assistant for real-time notes and conversational intelligence

Created 1 month ago
2,249 stars

Top 19.7% on SourcePulse

View on GitHub
1 Expert Loves This Project
Project Summary

OpenOats is a meeting note-taker that talks back: it provides real-time assistance during calls by transcribing conversations locally and surfacing relevant information from a personal knowledge base. It targets privacy-conscious power users and professionals on macOS who want to arrive at meetings better prepared without exposing data to external services. The primary benefit is intelligent, context-aware support delivered entirely on-device.

How It Works

OpenOats captures microphone and system audio, performing fully offline speech recognition directly on the user's Mac. It integrates with local LLMs via Ollama or cloud models through OpenRouter, and utilizes embedding models (Voyage AI, local Ollama, or OpenAI-compatible) to index a user-provided folder of notes. When a conversation reaches a critical juncture, OpenOats searches this knowledge base and surfaces relevant talking points in real-time, aiming to provide timely, contextually appropriate suggestions. This local-first, privacy-centric approach ensures no audio leaves the device during transcription.
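The embed-and-search loop described above can be sketched in a few lines. OpenOats itself is a Swift app backed by real embedding models (Voyage AI, local Ollama, or OpenAI-compatible); the bag-of-words "embedding" below is a self-contained stand-in for those models, and the note file names are invented for illustration:

```python
# Conceptual sketch of indexing a notes folder and searching it against a
# live transcript snippet. The Counter-based "embedding" is a toy stand-in
# for a real embedding model so the example runs with no dependencies.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: lowercase word counts (stand-in for a real model)."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def index_notes(notes: dict[str, str]) -> list[tuple[str, Counter]]:
    """Index a {filename: contents} mapping, as OpenOats indexes .md/.txt files."""
    return [(name, embed(body)) for name, body in notes.items()]

def search(index, transcript_snippet: str, top_k: int = 1) -> list[str]:
    """Rank indexed notes against the transcript and return the best matches."""
    query = embed(transcript_snippet)
    scored = sorted(index, key=lambda item: cosine(query, item[1]), reverse=True)
    return [name for name, _ in scored[:top_k]]

# Hypothetical knowledge base for illustration.
notes = {
    "pricing.md": "Our enterprise pricing tiers and discount policy",
    "roadmap.md": "Q3 roadmap: offline transcription and Ollama support",
}
index = index_notes(notes)
print(search(index, "can you remind me about the discount policy?"))  # ['pricing.md']
```

In the real app the query would be a window of the live transcript and the similarity would run over model-generated vectors, but the ranking step works the same way.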

Quick Start & Requirements

  • Install: via Homebrew (brew tap yazinsai/openoats && brew install --cask yazinsai/openoats/openoats), by downloading the latest DMG from the Releases page, or by building from source (./scripts/build_swift_app.sh).
  • Prerequisites: Apple Silicon Mac, macOS 15+, Xcode 26 / Swift 6.2.
  • Configuration: Requires API keys for OpenRouter and Voyage AI for cloud features. For local mode, Ollama must be running with desired models. OpenAI-compatible embedding endpoints can also be configured.
  • Setup: Initial run downloads a ~600MB local speech model. Users must point the app to a folder containing .md or .txt files for their knowledge base.
  • Links: Homebrew tap: https://github.com/yazinsai/OpenOats

Highlighted Details

  • Fully offline transcription and LLM processing capabilities when using Ollama.
  • Real-time transcript viewing and one-click copying.
  • Knowledge base search powered by embeddings (Voyage AI, local, or OpenAI-compatible).
  • Default "invisible" mode hides the application window from screen sharing.
  • Zero network calls required when configured for fully local operation.

Maintenance & Community

No specific details regarding maintainers, community channels (e.g., Discord, Slack), or roadmap are provided in the README.

Licensing & Compatibility

Licensed under the MIT License, permitting commercial use and modification.

Limitations & Caveats

The software is restricted to Apple Silicon Macs running macOS 15+. Users bear sole responsibility for obtaining necessary consent for recording conversations, as the application provides no legal guidance and disclaims liability for non-compliant usage.

Health Check
Last Commit

1 day ago

Responsiveness

Inactive

Pull Requests (30d)
203
Issues (30d)
135
Star History
2,270 stars in the last 30 days

Explore Similar Projects

Starred by Tobi Lutke (Cofounder of Shopify), Chip Huyen (Author of "AI Engineering" and "Designing Machine Learning Systems"), and 9 more.

companion-app by a16z-infra

Top 0.1%
6k stars
AI companion stack for personalized chatbots
Created 2 years ago
Updated 1 year ago