brainqub3/claude_code_RLM: Recursive Language Models for extended context processing
Top 80.5% on SourcePulse
Summary
This repository offers a minimal Recursive Language Model (RLM) scaffold built on Claude Code. It tackles the challenge of processing inputs that exceed an LLM's context window by decomposing documents programmatically and recursively calling the model over the resulting chunks. It is aimed at Claude Code users who need to analyze text collections larger than a single context window.
How It Works
The architecture mirrors the RLM paper: a root LLM (Claude Opus 4.5) orchestrates sub-LLM calls (Claude Haiku) via a persistent Python REPL. This REPL functions as the external environment, managing state and providing utilities for chunking and searching. This approach allows recursive processing of inputs significantly larger than standard context limits.
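The root/sub-LLM recursion described above can be sketched in a few lines of Python. This is a minimal illustration, not the repository's actual code: `call_sub_llm` is a hypothetical stand-in for a real sub-model call (e.g. Claude Haiku via an SDK), stubbed here so the control flow is visible.

```python
def call_sub_llm(prompt: str, chunk: str) -> str:
    # Stub: a real implementation would send the prompt and chunk
    # to the sub-model and return its answer.
    return f"summary({len(chunk)} chars)"

def chunk_text(text: str, size: int) -> list[str]:
    # Split an oversized input into pieces that fit a context window.
    return [text[i:i + size] for i in range(0, len(text), size)]

def rlm_answer(query: str, document: str, window: int = 1000) -> str:
    # Base case: the document fits in one context window.
    if len(document) <= window:
        return call_sub_llm(query, document)
    # Recursive case: map the sub-model over chunks, then recurse
    # on the concatenated partial answers until they fit.
    partials = [call_sub_llm(query, c) for c in chunk_text(document, window)]
    return rlm_answer(query, " ".join(partials), window)

print(rlm_answer("What is this about?", "x" * 5000))
```

In the actual scaffold, the root model (Claude Opus 4.5) drives this loop itself from inside the persistent REPL rather than following fixed Python control flow, which lets it choose chunk boundaries and search strategies dynamically.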
Quick Start & Requirements
1. Clone the repository (`git clone https://github.com/Brainqub3/claude_code_RLM.git`) and cd into it.
2. Place the file(s) to analyze in the `context/` directory.
3. Launch Claude Code (`claude`), then execute the RLM skill (`/rlm`).
4. Follow the prompts for the context file path and your query.
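Assuming the repository layout matches the steps above, the quick start condenses to the following commands (the file path shown is a placeholder; the query itself is entered interactively inside Claude Code):

```shell
# Clone the scaffold and enter the repo
git clone https://github.com/Brainqub3/claude_code_RLM.git
cd claude_code_RLM

# Put the large document to analyze in the context/ directory
cp /path/to/large_document.txt context/

# Launch Claude Code, then invoke the RLM skill at its prompt:
claude
# > /rlm
```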
Maintenance & Community
No specific details regarding contributors, sponsorships, community channels, or roadmaps are present in the README.
Licensing & Compatibility
The README directs users to a LICENSE file for details. The specific license type and implications for commercial use or closed-source integration are not explicitly stated.
Limitations & Caveats
This is a minimal implementation, not production-ready. It relies on Claude Code's --dangerously-skip-permissions mode, which executes commands without confirmation; run it only with careful setup, in an isolated environment, and with full awareness of the risks.