ralph by iannuttall

File-based agent loop drives autonomous coding

Created 2 weeks ago

639 stars

Top 52.0% on SourcePulse

Summary

Ralph is a minimal, file-based agent loop for autonomous coding, aimed at developers and researchers. It streamlines AI-driven code generation by treating project files and Git as persistent memory: each iteration starts from a fresh, on-disk state instead of relying on a long-lived model context.

How It Works

Ralph operates iteratively, processing one "story" at a time with state persisted in .ralph/. A JSON PRD defines the tasks; each loop iteration reads the on-disk state, executes the next story, and commits the result. Because the agent rereads state from files and Git rather than carrying a long conversation history, runs stay reproducible and the model's context stays small.
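As a rough mental model, a single turn of the loop can be driven by hand as below. This is a sketch rather than Ralph's internal implementation; it uses only the ralph build command documented in the quick start, and my-project is a placeholder directory.

    # One iteration, sketched by hand (illustrative; my-project is a placeholder)
    cd my-project
    ralph build 1          # read state from .ralph/, run the agent on the next story, commit
    git log -1 --oneline   # the resulting commit is the durable memory for the next iteration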

Quick Start & Requirements

  • Install globally: npm i -g @iannuttall/ralph
  • Launch the PRD prompt: ralph prd
  • Run one iteration: ralph build 1 (dry run without committing: ralph build 1 --no-commit)
  • Optional: install templates with ralph install, and skills with ralph install --skills
  • Prerequisites: the agent CLIs (Codex, Claude, Droid, OpenCode) must be installed and configured separately; agent selection is set via AGENT_CMD in .agents/ralph/config.sh
  • Setup time depends mainly on configuring the chosen agent; a typical command sequence is shown below
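
Put together, a first session might look like the following sequence. The ordering is a suggestion rather than a documented requirement, and my-project is again a placeholder.

    npm i -g @iannuttall/ralph       # install the CLI globally
    cd my-project                    # placeholder: the repository Ralph should work in
    ralph install                    # optional: install the templates
    ralph prd                        # launch the PRD prompt to define stories
    ralph build 1 --no-commit        # dry run a single iteration
    ralph build 1                    # run the iteration for real and commit the result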

Highlighted Details

  • File-based Memory: Uses project files and Git for state (.ralph/, .agents/ralph/).
  • Agent Agnosticism: Supports multiple LLM agents (Codex, Claude, Droid, OpenCode) via AGENT_CMD.
  • Iterative Story Processing: Executes tasks sequentially, updating story status (open, in_progress, done).
  • Customization: Overrides prompts/behavior via .agents/ralph/ templates.
  • Stale Story Handling: Configurable STALE_SECONDS for reopening stalled stories; see the config sketch after this list.
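
Both AGENT_CMD and STALE_SECONDS are read from .agents/ralph/config.sh. A minimal sketch of that file follows; only the variable names and the file path come from the README, and the values shown are illustrative assumptions.

    # .agents/ralph/config.sh (values are illustrative assumptions)
    AGENT_CMD="claude"      # which agent CLI Ralph shells out to (Codex, Claude, Droid, OpenCode)
    STALE_SECONDS=1800      # reopen a story left in_progress for longer than this many seconds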

Maintenance & Community

The README provides no details on contributors, sponsorships, or community channels (e.g., Discord, Slack).

Licensing & Compatibility

The README does not specify a license, so suitability for commercial use or inclusion in closed-source projects cannot be assessed.

Limitations & Caveats

Functionality depends critically on the separate installation and correct configuration of external LLM agent CLIs, so setup complexity is tied to those third-party tools. The project is only a couple of weeks old and deliberately minimal, so its behavior and interfaces are likely still evolving.

Health Check

  • Last Commit: 2 weeks ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 7
  • Issues (30d): 6
  • Star History: 648 stars in the last 15 days
