Local AI baby monitor
This project provides a local, privacy-first AI baby monitor that uses a video large language model (LLM) to detect safety-rule violations in a video stream. It is aimed at parents and caregivers who want an additional layer of awareness: when a predefined rule is broken, the system emits a gentle beep in real time, with all processing done on consumer hardware.
How It Works
The system captures video frames from sources such as webcams or RTSP cameras and pushes them to Redis. A separate watcher script retrieves the most recent frames, constructs a prompt from natural-language safety rules, and sends it to a local vLLM server (serving Qwen2.5-VL by default). The model analyzes the frames against the rules and returns structured JSON. If an alert condition is met, a beep is triggered, and a Streamlit dashboard displays the live feed alongside the model's reasoning.
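The watcher's decision step can be sketched as follows. All names here (`build_prompt`, `should_alert`, the `alert` and `reasoning` JSON fields, the prompt wording) are illustrative assumptions, not the project's actual code; the real logic lives in `scripts/run_watcher.py`.

```python
import json

def build_prompt(rules: list[str]) -> str:
    """Combine natural-language safety rules into one instruction for the
    video LLM. The wording the project actually uses may differ."""
    numbered = "\n".join(f"{i + 1}. {r}" for i, r in enumerate(rules))
    return (
        "You are monitoring a child via the attached video frames.\n"
        f"Safety rules:\n{numbered}\n"
        'Reply with JSON: {"alert": <bool>, "reasoning": <str>}.'
    )

def should_alert(llm_reply: str) -> bool:
    """Decide whether to beep from the model's structured JSON reply.
    Malformed output is treated as 'no alert' rather than crashing."""
    try:
        data = json.loads(llm_reply)
    except json.JSONDecodeError:
        return False
    return bool(data.get("alert", False))

# In the real watcher, a loop would read the latest frames from Redis,
# send build_prompt(rules) plus the frames to the local vLLM server,
# and trigger the beep whenever should_alert(reply) returns True.
```

Treating unparseable model output as "no alert" is one reasonable failure mode; an implementation could just as well log it or retry.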
Quick Start & Requirements
Docker and uv are required. Clone the repository, create your environment file, and start the services:

```shell
git clone https://github.com/zeenolife/ai-baby-monitor.git && cd ai-baby-monitor
cp .env.template .env
docker compose up --build -d
```

Run the watcher as a separate process on the host (see Limitations & Caveats for why it does not run inside Docker):

```shell
uv run scripts/run_watcher.py --config-file configs/living_room.yaml
```

Then open the Streamlit dashboard at http://localhost:8501.
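The config file passed to the watcher presumably defines the camera source and the natural-language safety rules. The sketch below is a hypothetical schema for illustration only; the actual keys in `configs/living_room.yaml` may differ.

```yaml
# Hypothetical watcher config sketch - not the project's real schema.
camera: 0                # webcam index, or an rtsp:// URL
rules:
  - "The child should not climb out of the crib."
  - "The child should not put small objects in their mouth."
```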
Limitations & Caveats
The project is explicitly described as an experimental hobby tool and not a replacement for adult supervision. Because playing sound from inside a Docker container through the host's speakers is difficult, the watcher script runs as a separate process on the host.