StenoAI by ruzin: Local AI meeting summarization tool
Top 96.8% on SourcePulse
Summary
StenoAI addresses the need for private, AI-powered meeting notes by running entirely on the user's device. It offers local transcription via OpenAI Whisper and summarization using various locally hosted Small Language Models (SLMs) via Ollama, providing a privacy-first solution with no cloud dependencies or service costs.
How It Works
The project leverages OpenAI's Whisper AI for robust, local audio transcription. For summarization, it integrates with Ollama, allowing users to select from several SLMs (e.g., Llama 3.2:3b, Gemma 3:4b, Qwen 3:8b, Deepseek-R1:8b), each optimized for different use cases like speed, structured output, or reasoning. This architecture ensures all data processing remains on the user's machine, prioritizing user privacy and data security.
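The transcribe-then-summarize flow described above can be sketched in a few lines. StenoAI's actual function names are not documented in this summary, so `build_summary_prompt` and `summarize_transcript` below are illustrative assumptions; the commented-out usage relies on the real `openai-whisper` and `ollama` Python packages and a running Ollama server.

```python
# Sketch of a local meeting-notes pipeline: Whisper transcribes audio on
# device, then an Ollama-hosted SLM turns the transcript into a summary.
# Function names and prompt wording here are assumptions, not StenoAI's API.

def build_summary_prompt(transcript: str) -> str:
    """Wrap the raw transcript in a summarization instruction."""
    return (
        "Summarize the following meeting transcript as concise bullet "
        "points with action items:\n\n" + transcript
    )

def summarize_transcript(transcript: str, chat_fn) -> str:
    """Run the prompt through any chat callable (e.g. an ollama.chat wrapper),
    keeping the pipeline testable without a live model."""
    return chat_fn(build_summary_prompt(transcript))

# Rough real-world usage (requires openai-whisper, ollama, and ffmpeg):
#
#   import whisper, ollama
#   text = whisper.load_model("base").transcribe("meeting.wav")["text"]
#   def ollama_chat(prompt):
#       resp = ollama.chat(model="llama3.2:3b",
#                          messages=[{"role": "user", "content": prompt}])
#       return resp["message"]["content"]
#   print(summarize_transcript(text, ollama_chat))
```

Passing the chat function in as a parameter mirrors the architecture's key property: any locally hosted SLM can be swapped in without touching the transcription side.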
Quick Start & Requirements
The primary installation method for macOS users is downloading and opening a provided DMG file. For local development or advanced use, prerequisites include Python 3.8+, Node.js 18+, Homebrew, Ollama (with models like llama3.2:3b pulled), and ffmpeg. Performance is significantly better on Apple Silicon Macs compared to Intel Macs due to hardware acceleration.
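For local development, the requirement list above can be verified with a small check script. StenoAI may ship its own setup tooling; this is only a sketch of the checks the listed prerequisites imply (Python 3.8+, plus `ffmpeg`, `ollama`, and `node` on the PATH).

```python
# Hypothetical prerequisite checker for a local StenoAI dev setup.
# The tool list mirrors the README requirements; nothing here is
# taken from StenoAI's own scripts.
import shutil
import sys

REQUIRED_TOOLS = ("ffmpeg", "ollama", "node")

def missing_prereqs(which=shutil.which, version=sys.version_info):
    """Return a list of unmet requirements; an empty list means ready."""
    missing = []
    if version < (3, 8):
        missing.append("python>=3.8")
    for tool in REQUIRED_TOOLS:
        if which(tool) is None:
            missing.append(tool)
    return missing

if __name__ == "__main__":
    gaps = missing_prereqs()
    print("OK" if not gaps else "Missing: " + ", ".join(gaps))
```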
Highlighted Details
Maintenance & Community
The project is maintained by Ruzin, who links the Twitter handle @ruzin for updates. The README lists no other community channels (e.g., Discord, Slack) and no detailed roadmap.
Licensing & Compatibility
StenoAI is licensed under CC BY-NC 4.0 (Creative Commons Attribution-NonCommercial 4.0 International). This license permits free use for personal, non-commercial purposes but explicitly restricts commercial application.
Limitations & Caveats
Performance on Intel Macs is notably limited due to the absence of dedicated AI inference hardware. The non-commercial clause in the CC BY-NC 4.0 license restricts its adoption in business or commercial contexts.