meeting-minutes by Zackriya-Solutions

Local AI meeting assistant for real-time transcription and summarization

created 7 months ago
6,793 stars

Top 7.6% on sourcepulse

View on GitHub
Project Summary

Meetily is an open-source, self-hosted AI assistant designed for live meeting note-taking and summary generation. It targets individuals and teams seeking a privacy-first, cost-effective solution to automate meeting documentation, allowing users to focus on discussions rather than manual note-taking.

How It Works

Meetily captures live audio from the microphone and system output, transcribes it in real time using Whisper.cpp, and generates summaries. All processing runs locally for user privacy, the tool works offline, and it is not tied to any particular meeting platform. The architecture comprises an Audio Capture Service, a Transcription Engine (Whisper.cpp), an LLM Orchestrator that supports a range of AI models, and data services: ChromaDB for semantic search and SQLite for metadata.
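
As a rough illustration of that flow, the sketch below (Python, not Meetily's actual code) transcribes an already-captured WAV file with a local whisper.cpp build and stores the result in SQLite; the binary path, model file, table schema, and helper names are assumptions made for the example, and the summarization step is deferred to the LLM sketch further below.

```python
# Illustrative transcription step: whisper.cpp CLI on a captured WAV, metadata into SQLite.
# Binary path, model file, and schema are assumptions, not the project's real layout.
import sqlite3
import subprocess
from datetime import datetime, timezone
from pathlib import Path

WHISPER_BIN = Path("whisper.cpp/main")            # assumed local whisper.cpp build
WHISPER_MODEL = Path("models/ggml-base.en.bin")   # assumed GGML model file

def transcribe(wav_path: str) -> str:
    """Run whisper.cpp locally; -otxt writes the transcript next to the input as <wav>.txt."""
    subprocess.run(
        [str(WHISPER_BIN), "-m", str(WHISPER_MODEL), "-f", wav_path, "-otxt"],
        check=True,
    )
    return Path(wav_path + ".txt").read_text(encoding="utf-8")

def store(db_path: str, wav_path: str, transcript: str) -> None:
    """Keep meeting metadata in SQLite; the transcript would also be embedded into ChromaDB."""
    with sqlite3.connect(db_path) as db:
        db.execute(
            "CREATE TABLE IF NOT EXISTS meetings (recorded_at TEXT, source_file TEXT, transcript TEXT)"
        )
        db.execute(
            "INSERT INTO meetings VALUES (?, ?, ?)",
            (datetime.now(timezone.utc).isoformat(), wav_path, transcript),
        )

if __name__ == "__main__":
    text = transcribe("meeting.wav")
    store("meetings.db", "meeting.wav", text)
    print(text[:500])  # transcript then goes to the LLM orchestrator for summarization
```

The real application streams live audio continuously rather than reading a finished file; the sketch only shows a batch version of the same transcribe-then-store steps.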

Quick Start & Requirements

  • Installation: Download packaged executables from the releases page for Windows (.exe, .msi) or macOS (.dmg). For development, clone the repository and set up frontend and backend environments.
  • Prerequisites: Node.js 18+, Python 3.10+, FFmpeg, Rust 1.65+ (experimental), CMake 3.22+ (frontend build). Windows requires Visual Studio Build Tools with C++ workload.
  • Setup: Packaged executables offer a straightforward setup. Development setup involves cloning the repository, creating a virtual environment, installing dependencies, and building Whisper.cpp; a minimal prerequisite check is sketched after this list.
  • Links: Website, Demo Video
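
For the development route, a quick way to confirm the tooling above is available is a small, stdlib-only Python check such as the one below; it is purely illustrative (not part of the repository) and only verifies that each tool is on PATH, leaving exact version checks (e.g. node --version, cmake --version) to the reader.

```python
# Illustrative prerequisite check for a development setup; not part of the Meetily repo.
# Only checks that tools are discoverable on PATH; exact version pinning is omitted.
import shutil
import sys

def check(name: str, ok: bool, hint: str) -> bool:
    status = "OK     " if ok else "MISSING"
    print(f"{status} {name}" + ("" if ok else f"  -> {hint}"))
    return ok

def main() -> int:
    results = [
        check("Python 3.10+", sys.version_info >= (3, 10), "install Python 3.10 or newer"),
        check("Node.js", shutil.which("node") is not None, "install Node.js 18+"),
        check("FFmpeg", shutil.which("ffmpeg") is not None, "install FFmpeg and add it to PATH"),
        check("CMake", shutil.which("cmake") is not None, "install CMake 3.22+ for the frontend build"),
        check("Rust (cargo)", shutil.which("cargo") is not None, "install Rust 1.65+ (experimental)"),
    ]
    return 0 if all(results) else 1

if __name__ == "__main__":
    sys.exit(main())
```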

Highlighted Details

  • Privacy-first: All processing occurs locally on the user's device.
  • Cost-effective: Leverages open-source AI models, avoiding expensive API calls.
  • Flexible: Supports offline operation and customization for self-hosting.
  • LLM Integration: Supports Anthropic, Groq, and Ollama via a unified interface (a sketch of such an interface follows this list).
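
The unified interface could be modelled roughly as shown below; the function names and routing table are illustrative assumptions rather than Meetily's actual orchestrator, only the Ollama path is implemented (against Ollama's default local HTTP API at http://localhost:11434), and the hosted providers are stubbed to keep the sketch dependency-free.

```python
# Sketch of a provider-agnostic summarization interface; names are illustrative
# assumptions, not the project's real orchestrator code.
import json
import urllib.request
from typing import Callable, Dict

Prompt = str

def ollama_generate(prompt: Prompt, model: str = "llama3") -> str:
    """Local inference through Ollama's default HTTP endpoint; nothing leaves the machine."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

def anthropic_generate(prompt: Prompt) -> str:
    # Would wrap the official anthropic SDK; omitted to keep the sketch dependency-free.
    raise NotImplementedError("requires an Anthropic API key and the anthropic package")

def groq_generate(prompt: Prompt) -> str:
    # Would wrap the groq SDK; omitted for the same reason.
    raise NotImplementedError("requires a Groq API key and the groq package")

PROVIDERS: Dict[str, Callable[[Prompt], str]] = {
    "ollama": ollama_generate,
    "anthropic": anthropic_generate,
    "groq": groq_generate,
}

def summarize(transcript: str, provider: str = "ollama") -> str:
    """Route one prompt through whichever backend the user configured."""
    prompt = "Summarize this meeting transcript into decisions and action items:\n\n" + transcript
    return PROVIDERS[provider](prompt)
```

Adding another backend then amounts to registering one more callable in PROVIDERS.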

Maintenance & Community

The project is actively maintained, with recent updates including Windows support and improved error handling. Community engagement is encouraged via a Discord channel.

Licensing & Compatibility

MIT License. Permissive for commercial use and closed-source linking.

Limitations & Caveats

Summarization quality can be poor with smaller LLMs; models with 32B+ parameters are recommended. The backend build process requires significant development tooling (CMake, a C++ compiler, Rust), which can be a hurdle for non-developer users. Linux support is planned but not yet implemented.

Health Check

  • Last commit: 1 day ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 3
  • Issues (30d): 4

Star History

1,752 stars in the last 90 days

Explore Similar Projects

Starred by Addy Osmani (Engineering Leader on Google Chrome), Victor Taelin (Author of Bend, Kind, HVM), and 1 more.

chatbox by chatboxai

  • Top 0.3% · 36k stars
  • Desktop client app for AI models/LLMs
  • Created 2 years ago, updated 6 days ago