ochyai/vibe-local: Offline AI coding assistant powered by local LLMs
Top 73.1% on SourcePulse
vibe-local offers a free, offline, zero-dependency AI coding environment that pairs Ollama with a single-file Python agent. It targets educators, researchers, and students who want to understand AI agent mechanics or practice coding without internet access or API keys, providing a transparent, accessible, and cost-free platform.
How It Works
The core vibe-coder.py agent is a self-contained Python script that uses only the standard library, eliminating external package dependencies. It communicates directly with a local Ollama instance for LLM inference, an architecture that prioritizes simplicity, readability, and offline operation. The agent features 16 built-in tools, supports MCP (Model Context Protocol) integration, and includes advanced functionality such as Plan/Act modes and Git checkpoints, designed for educational transparency and research flexibility.
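The stdlib-only pattern described above can be sketched in a few lines. This is not the actual vibe-coder.py code, just an illustrative example of talking to Ollama's real /api/generate REST endpoint with nothing beyond json and urllib; the function names and the model tag are assumptions for illustration.

```python
import json
import urllib.request

# Ollama's default local endpoint (no API key, no external packages needed).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Encode a non-streaming generate request for Ollama's REST API."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")

def parse_reply(raw: bytes) -> str:
    """Extract the generated text from an /api/generate JSON response."""
    return json.loads(raw)["response"]

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_reply(resp.read())
```

Because the transport is plain HTTP and JSON, the whole inference loop stays readable in one file, which is what makes this kind of agent useful as a teaching artifact.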
Quick Start & Requirements
Installation is a single command: curl -fsSL https://raw.githubusercontent.com/ochyai/vibe-local/main/install.sh | bash (a PowerShell equivalent is provided for Windows). Prerequisites are a locally installed and running Ollama instance and Python 3.8+. Hardware requirements are significant, and RAM is the critical factor: 16 GB is recommended, 8 GB is the minimum. An NVIDIA GPU is recommended on Windows and Linux.
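The two prerequisites above (Python 3.8+ and a running Ollama server) can be verified with a short preflight check. This is a hypothetical helper, not part of the project's installer; the default Ollama port 11434 is the only assumption taken from Ollama itself.

```python
import sys
import urllib.error
import urllib.request

def check_python(min_version: tuple = (3, 8)) -> bool:
    """Verify the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

def check_ollama(url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if a local Ollama server answers on its default port."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Running both checks before launching the agent gives a clearer failure message than a connection error mid-session.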
Highlighted Details
The entire agent (vibe-coder.py) is a single file using only Python's standard library.
Maintenance & Community
The provided README focuses on technical features and usage, not specific contributors, community channels, or a public roadmap.
Licensing & Compatibility
Components are permissively licensed: vibe-coder.py and vibe-local (MIT), Ollama (MIT), Qwen3 models (Apache 2.0). Compatible with research and education, with no explicit commercial use restrictions beyond standard open-source terms.
Limitations & Caveats
Local LLMs are less accurate than cloud-based models and may unintentionally suggest dangerous commands. Users must exercise caution with commands like sudo or destructive disk operations, and are advised to use the default "normal mode", which asks for confirmation before execution. Performance depends heavily on local hardware.
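A confirmation gate of the kind "normal mode" implies can be sketched as follows. The marker list and function name are illustrative assumptions; the README does not show the agent's actual safeguard logic.

```python
# Illustrative substrings that should trigger a confirmation prompt.
# This is NOT vibe-local's actual list, just a plausible example.
DANGEROUS_MARKERS = ("sudo", "rm -rf", "mkfs", "dd if=")

def needs_confirmation(command: str) -> bool:
    """Flag shell commands that should be confirmed before execution."""
    return any(marker in command for marker in DANGEROUS_MARKERS)

def run_guarded(command: str) -> None:
    """Ask the user before executing anything flagged as dangerous."""
    if needs_confirmation(command):
        answer = input(f"About to run: {command!r}. Proceed? [y/N] ")
        if answer.strip().lower() != "y":
            print("Skipped.")
            return
    print(f"(would execute: {command})")  # real agent would spawn a subprocess here
```

Substring matching like this is a coarse filter; it errs on the side of prompting, which is the right trade-off when a local model is generating the commands.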