vibe-local  by ochyai

Offline AI coding assistant powered by local LLMs

Created 3 days ago


395 stars

Top 73.1% on SourcePulse

Project Summary

vibe-local offers a free, offline, zero-dependency AI coding environment, pairing Ollama with a single-file Python agent. It targets educators, researchers, and students who want to understand AI agent mechanics or practice coding without internet access or API keys, providing a transparent, accessible, and cost-free platform.

How It Works

The core vibe-coder.py agent is a self-contained Python script using only the standard library, eliminating external package dependencies. It communicates directly with a local Ollama instance for LLM inference. This architecture prioritizes simplicity, readability, and offline operation. The agent features 16 built-in tools, supports MCP integration, and includes advanced functionalities like Plan/Act modes and Git checkpoints, designed for educational transparency and research flexibility.
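To illustrate the architecture, here is a minimal sketch of how a standard-library-only Python agent can talk to a local Ollama instance over its REST API. This is not the project's actual code; the function names are hypothetical, but the endpoint (`/api/chat`) and request shape follow Ollama's documented API.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_chat_request(model, messages):
    """Build a POST request for Ollama's /api/chat using only the stdlib."""
    body = json.dumps({"model": model, "messages": messages, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def chat(model, messages):
    """Send one chat turn and return the assistant's reply text."""
    req = build_chat_request(model, messages)
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)["message"]
    return reply["content"]
```

Because everything above is `json` and `urllib` from the standard library, the script stays a single readable file with nothing to `pip install`, which is the property the project emphasizes.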

Quick Start & Requirements

Installation is a single command: curl -fsSL https://raw.githubusercontent.com/ochyai/vibe-local/main/install.sh | bash (or the PowerShell equivalent). Prerequisites are a locally installed and running Ollama instance and Python 3.8+. RAM is the critical hardware requirement: 16 GB is recommended, 8 GB is the minimum. An NVIDIA GPU is recommended on Windows/Linux.
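Before piping the installer, it can be worth verifying the prerequisites. The pre-flight check below is a hypothetical convenience, not part of the official install script; it tests the Python 3.8+ requirement and probes Ollama's default port.

```shell
# Hypothetical pre-flight check (not part of install.sh):
# verify Python 3.8+ and see whether a local Ollama server responds.
python3 -c 'import sys; assert sys.version_info >= (3, 8), "Python 3.8+ required"' \
  && echo "Python: OK"
curl -fsS http://localhost:11434/api/tags > /dev/null 2>&1 \
  && echo "Ollama: running" \
  || echo "Ollama: not reachable on localhost:11434"
```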

Highlighted Details

  • Zero External Dependencies: Core agent (vibe-coder.py) is a single file using only Python's standard library.
  • Fully Offline Capable: Designed for use without internet, ideal for workshops.
  • Rich Feature Set: Includes 16 built-in tools, MCP integration, Plan/Act modes, Git checkpoints, auto-test loop, file watcher, parallel agents, and a fixed footer TUI.
  • Educational Design: Human-readable source code facilitates understanding of AI agent internals.

Maintenance & Community

The README focuses on technical features and usage; it does not name specific contributors, community channels, or a public roadmap.

Licensing & Compatibility

All components are permissively licensed: vibe-coder.py and vibe-local (MIT), Ollama (MIT), and the Qwen3 models (Apache 2.0). This makes the stack suitable for research and education, with no commercial-use restrictions beyond standard open-source terms.

Limitations & Caveats

Local LLMs are less accurate than cloud-based models and may inadvertently suggest dangerous commands. Users should exercise caution with commands such as sudo or disk operations, and keep the default "normal mode", which asks for confirmation before execution. Performance depends heavily on local hardware.
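A confirmation gate of the kind "normal mode" implies can be sketched in a few lines. The patterns and function names below are illustrative assumptions, not the project's actual safeguard list.

```python
import re

# Illustrative patterns for commands that should always require confirmation;
# the real agent's criteria (if it uses a pattern list at all) may differ.
DANGEROUS_PATTERNS = [
    r"\bsudo\b",      # privilege escalation
    r"\brm\s+-rf\b",  # recursive force-delete
    r"\bmkfs\b",      # filesystem formatting
    r"\bdd\s+if=",    # raw disk writes
]

def needs_confirmation(command: str) -> bool:
    """Return True if a model-suggested shell command looks risky."""
    return any(re.search(p, command) for p in DANGEROUS_PATTERNS)

def run_with_gate(command, execute, confirm=input):
    """Execute a command, but ask the user first if it matches a risky pattern."""
    if needs_confirmation(command):
        answer = confirm(f"Run risky command? {command!r} [y/N] ")
        if answer.strip().lower() != "y":
            return None  # user declined; command is skipped
    return execute(command)
```

Defaulting to a prompt (rather than auto-execution) is the safer failure mode when the model's suggestions cannot be fully trusted, which matches the caveat above.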

Health Check
Last Commit

21 hours ago

Responsiveness

Inactive

Pull Requests (30d)
6
Issues (30d)
4
Star History
421 stars in the last 3 days

