acon96/home-llm: Home Assistant integration for local LLM smart home control
Top 34.3% on SourcePulse
This project provides a local Large Language Model (LLM) integration for Home Assistant, enabling users to control smart home devices with natural language. It targets Home Assistant users who want a private, offline conversational assistant for home automation. The primary benefit is voice control and complex automation sequences without relying on cloud services.
How It Works
The solution consists of a custom Home Assistant conversation agent and fine-tuned LLMs. The agent exposes local LLMs (via llama-cpp-python, Ollama, or text-generation-webui) as a conversation backend within Home Assistant. The "Home" LLMs are fine-tuned on a synthetic dataset designed for function calling, allowing them to interpret natural language commands and map them to Home Assistant services (e.g., light.turn_on). The models are quantized for low-resource environments like Raspberry Pis.
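To make the mapping concrete, the sketch below shows how a JSON-style function call emitted by the model could be translated into a Home Assistant service invocation over the standard REST API. The output schema, entity IDs, URL, and token are assumptions for illustration; the integration itself performs this step internally through Home Assistant's service registry rather than over HTTP.

```python
import json
import requests

# Illustrative only: the exact JSON schema the fine-tuned models emit may
# differ, and the URL/token below are assumed placeholder values.
HASS_URL = "http://homeassistant.local:8123"   # assumed local instance
HASS_TOKEN = "<long-lived-access-token>"       # assumed credential

def execute_model_command(model_output: str) -> None:
    """Parse a JSON 'tool call' produced by the LLM and invoke the
    corresponding Home Assistant service via the REST API."""
    call = json.loads(model_output)
    domain, service = call["service"].split(".", 1)   # e.g. "light", "turn_on"
    requests.post(
        f"{HASS_URL}/api/services/{domain}/{service}",
        headers={"Authorization": f"Bearer {HASS_TOKEN}"},
        json={"entity_id": call["target_device"]},
        timeout=10,
    )

# Example: the model turned "switch on the kitchen light" into this call.
execute_model_command('{"service": "light.turn_on", "target_device": "light.kitchen"}')
```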
Quick Start & Requirements
Runs llama-cpp-python within Home Assistant, or connects to a remote backend via Ollama, LocalAI, or text-generation-webui (a standalone loading sketch follows below).
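As a rough sanity check outside Home Assistant, a quantized Home model can be loaded directly with llama-cpp-python. The sketch below assumes a GGUF file downloaded from the acon96/Home-3B-v3-GGUF repository; the file name, prompt, and settings are illustrative, and the integration normally handles this step itself.

```python
from llama_cpp import Llama

# Sketch only: the file name below is an assumed download from
# acon96/Home-3B-v3-GGUF; inside Home Assistant the integration manages
# model loading and prompt construction for you.
llm = Llama(
    model_path="./Home-3B-v3.q4_k_m.gguf",
    n_ctx=2048,      # context window; the integration sizes this from its config
    n_gpu_layers=0,  # CPU-only, e.g. on a Raspberry Pi
)

response = llm.create_chat_completion(
    messages=[
        # The real integration builds a much richer system prompt that
        # includes the current device list and states.
        {"role": "system", "content": "You control a smart home."},
        {"role": "user", "content": "Turn off the living room lamp."},
    ],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```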
Highlighted Details
Fine-tuned "Home" models are published as quantized GGUF files (e.g., acon96/Home-3B-v3-GGUF, acon96/Home-1B-v3-GGUF). Models exposed by a supported backend such as text-generation-webui can also be used.
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The project is primarily focused on English, with experimental support for other languages. The "Home" models are fine-tuned for specific tasks and may have limitations in general QA or complex reasoning outside of smart home control. The minimum Home Assistant version requirement is quite recent (2025.4.1).