win4r/lossless-claw-enhanced: Lossless context management for LLM agents
Summary
This project addresses critical inaccuracies in OpenClaw's lossless context management, specifically its token estimation for CJK languages and core reliability bugs. The lossless-claw-enhanced fork introduces CJK-aware token estimation, improving accuracy by up to 6x, and integrates essential upstream bug fixes. This ensures context window integrity, prevents data loss, and enhances agent reliability, particularly for users working with multilingual content.
How It Works
The enhancement replaces OpenClaw's default sliding-window compaction with a DAG-based summarization system. It persists all conversation messages in a SQLite database, summarizes older message chunks, and then condenses these summaries into higher-level nodes, forming a directed acyclic graph. A novel CJK-aware token estimation module replaces the upstream's ASCII-centric calculation, accurately mapping CJK characters to tokens (1.5 tokens/char vs. upstream's 0.25 tokens/char). Critical upstream bug fixes related to authentication errors, session rotation detection, and handling of empty/aborted messages are cherry-picked for improved production stability.
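The CJK-aware weighting described above can be sketched minimally. The constants come from the README (roughly 1.5 tokens per CJK character vs. the upstream's ASCII-centric 0.25 tokens per character); the function names and the exact Unicode ranges are illustrative assumptions, not the plugin's actual implementation.

```python
# Minimal sketch of CJK-aware token estimation. Heuristic weights per the
# README: ~1.5 tokens/char for CJK text vs. ~0.25 tokens/char otherwise.
def is_cjk(ch: str) -> bool:
    """Rough check for CJK characters via common Unicode ranges."""
    cp = ord(ch)
    return (
        0x4E00 <= cp <= 0x9FFF      # CJK Unified Ideographs
        or 0x3400 <= cp <= 0x4DBF   # CJK Extension A
        or 0x3040 <= cp <= 0x30FF   # Hiragana / Katakana
        or 0xAC00 <= cp <= 0xD7AF   # Hangul Syllables
    )

def estimate_tokens(text: str) -> float:
    """Weight CJK characters at 1.5 tokens each, all others at 0.25."""
    return sum(1.5 if is_cjk(ch) else 0.25 for ch in text)
```

For a purely ASCII string like "hello", both estimators agree (5 × 0.25), but for a CJK string the upstream heuristic undercounts by the 6x factor the fork corrects.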
Quick Start & Requirements
1. Clone and link the plugin: git clone https://github.com/win4r/lossless-claw-enhanced.git && openclaw plugins install --link ./lossless-claw-enhanced. Alternatively, use a copy install: openclaw plugins install ./lossless-claw-enhanced.
2. Set contextEngine to lossless-claw in OpenClaw's plugin configuration.
3. Restart the gateway: openclaw gateway restart.
Highlighted Details
Exposes lcm_grep, lcm_describe, and lcm_expand tools for agents to query and retrieve historical context.
Maintenance & Community
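The persistence and retrieval model described in How It Works can be sketched as follows. This is a hypothetical simplification, not the plugin's actual schema or API: it uses a single parent link (a tree approximating the DAG), and the table, column, and function names are invented for illustration.

```python
import sqlite3

# Sketch of the store: every message is persisted at level 0, older chunks
# are summarized into level-1 nodes, and summaries can be condensed further.
def init_db(conn: sqlite3.Connection) -> None:
    conn.execute(
        """CREATE TABLE IF NOT EXISTS nodes (
               id INTEGER PRIMARY KEY,
               parent_id INTEGER REFERENCES nodes(id),
               level INTEGER NOT NULL,   -- 0 = raw message, 1+ = summary layers
               content TEXT NOT NULL
           )"""
    )

def add_message(conn: sqlite3.Connection, text: str) -> int:
    cur = conn.execute("INSERT INTO nodes (level, content) VALUES (0, ?)", (text,))
    return cur.lastrowid

def summarize_chunk(conn: sqlite3.Connection, ids: list[int], summary: str) -> int:
    """Create a summary node one level above its children and link them to it."""
    placeholders = ",".join("?" * len(ids))
    level = 1 + max(row[0] for row in conn.execute(
        f"SELECT level FROM nodes WHERE id IN ({placeholders})", ids))
    cur = conn.execute("INSERT INTO nodes (level, content) VALUES (?, ?)",
                       (level, summary))
    parent = cur.lastrowid
    conn.executemany("UPDATE nodes SET parent_id = ? WHERE id = ?",
                     [(parent, i) for i in ids])
    return parent

def grep(conn: sqlite3.Connection, pattern: str) -> list[tuple[int, str]]:
    """An lcm_grep-style substring lookup over all stored context."""
    return list(conn.execute(
        "SELECT id, content FROM nodes WHERE content LIKE ?", (f"%{pattern}%",)))
```

Because raw messages are never deleted, a grep-style query can always recover detail that the in-context summaries have condensed away, which is the "lossless" property the fork preserves.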
This project tracks the Martian-Engineering/lossless-claw main branch for upstream synchronization. No explicit community channels or contributor details are provided within the README.
Licensing & Compatibility
Limitations & Caveats
The README does not detail specific limitations, alpha status, or known bugs. Functionality relies on an external LLM for summarization, introducing potential latency and operational costs.