dispatch by bassimeledath

Expand AI context windows with parallel background workers

Created 1 month ago
319 stars

Top 85.1% on SourcePulse

Project Summary

/dispatch is a Claude Code skill that addresses the limitation of fixed AI context windows by letting users delegate complex, multi-step tasks to background AI workers. The main Claude Code session becomes a lean orchestrator, so users can manage extensive workflows without filling their primary context. Power users and engineers benefit from parallel execution across multiple AI agents, each with a fresh, full context window, which significantly increases effective context capacity and reduces cognitive load.

How It Works

The core design inverts the traditional AI interaction model: the main session acts as a mediator, not the primary executor. Upon receiving a /dispatch command, it generates a checklist plan. This plan is then handed off to background worker agents, each operating within its own dedicated, full context window. These workers execute tasks, manage their progress, and crucially, can ask clarifying questions directly to the user via the dispatcher if they encounter issues, preventing silent failures and context loss. This architecture ensures the main session remains lean and responsive, while the actual reasoning and implementation are distributed to specialized workers that can leverage diverse AI models (e.g., Claude, GPT, Gemini) for optimal performance on specific sub-tasks.
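The orchestrator/worker split described above can be sketched in Python. This is illustrative only: the actual skill shells out to worker CLIs such as `claude`, `agent`, or `codex`, while the `run_worker` helper and checklist format here are hypothetical stand-ins.

```python
from concurrent.futures import ThreadPoolExecutor

def run_worker(task: str) -> str:
    # In the real skill this would launch a worker CLI as a background
    # process (each with its own fresh context window); here we simulate
    # the worker's result so the sketch is self-contained.
    return f"done: {task}"

def dispatch(checklist: list[str]) -> list[str]:
    # The main session only builds the checklist and fans it out;
    # the reasoning and implementation happen in the workers.
    with ThreadPoolExecutor(max_workers=len(checklist)) as pool:
        return list(pool.map(run_worker, checklist))

results = dispatch(["refactor auth module", "write migration tests"])
```

The key design point this mirrors is that dispatching is non-blocking from the orchestrator's perspective: each task runs in its own worker, and the main session's context holds only the checklist and the results.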

Quick Start & Requirements

  • Installation:
    • User-level: npx skills add bassimeledath/dispatch -g
    • Project-level: npx skills add bassimeledath/dispatch
  • Prerequisites:
    • Host Session: Claude Code (claude).
    • Worker CLIs (optional, for multi-model dispatch): Cursor CLI (agent), Codex CLI (codex), or any CLI accepting a prompt argument.
  • Configuration: Auto-detects CLIs and models, generating ~/.dispatch/config.yaml. Running /dispatch with no arguments at session start is recommended to pre-load configuration.
  • Documentation: The installation commands above serve as the primary guide.
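The README states that /dispatch auto-generates ~/.dispatch/config.yaml but does not show the file's contents. A hypothetical shape, based only on the CLIs and models mentioned above (field names are illustrative, not documented), might look like:

```yaml
# Hypothetical sketch of ~/.dispatch/config.yaml — not from the README.
workers:
  claude:
    command: claude   # host session CLI
    model: opus
  codex:
    command: codex    # optional worker CLI for multi-model dispatch
default_worker: claude
```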

Highlighted Details

  • Extended Context: Effectively multiplies the usable context window by dispatching tasks to independent worker agents.
  • Orchestration Focus: Frees the main session to act solely as a mediator, preserving its context for high-level task management.
  • Interactive Problem Solving: Workers surface clarifying questions to the user when stuck, enabling seamless continuation without re-explanation.
  • Non-blocking Workflow: Dispatching tasks immediately returns control to the user's session.
  • Model Agnosticism: Supports mixing various AI models (e.g., Opus, Sonnet, Gemini) within a single dispatch command.

Maintenance & Community

No specific details regarding contributors, sponsorships, or community channels (like Discord/Slack) are provided in the README.

Licensing & Compatibility

  • License: MIT.
  • Compatibility: The MIT license permits commercial use and integration with closed-source projects.

Limitations & Caveats

The tool requires Claude Code as the host environment. While worker CLIs are optional, they are necessary for leveraging multi-model dispatch capabilities. The functionality is presented as a "skill" within the Claude Code ecosystem.

Health Check

  • Last Commit: 1 week ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 5
  • Issues (30d): 2
  • Star History: 315 stars in the last 30 days
