em-llm: Episodic memory architecture for unbounded LLM context
Top 99.6% on SourcePulse
Summary
EM-LLM addresses LLM context-length limits by integrating principles of human episodic memory, enabling virtually unbounded context processing without fine-tuning. It offers researchers and practitioners efficient, human-like information retrieval over very long inputs.
How It Works
The architecture mimics human episodic memory and event cognition: it segments incoming token sequences into "events" using Bayesian surprise, refines event boundaries with graph-theoretic metrics, and retrieves information through a two-stage process combining similarity-based lookup with temporally contiguous recall. This yields practically unbounded context lengths at modest computational cost.
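The surprise-based segmentation step can be sketched roughly as follows. This is a minimal illustration, not the repository's implementation: the function name, the `gamma` threshold parameter, and the trailing-window statistics are assumptions, and the graph-theoretic boundary refinement described above is omitted entirely.

```python
import numpy as np

def segment_by_surprise(logprobs, gamma=1.0, window=64):
    """Split a token stream into events at high-surprise boundaries.

    A boundary is placed where surprise (-log p of the token) exceeds
    the running mean plus `gamma` standard deviations over a trailing
    window, loosely following a Bayesian-surprise criterion.
    """
    surprise = -np.asarray(logprobs, dtype=float)  # surprise = -log p(token | context)
    boundaries = [0]
    for t in range(1, len(surprise)):
        lo = max(0, t - window)
        mu, sigma = surprise[lo:t].mean(), surprise[lo:t].std()
        if surprise[t] > mu + gamma * sigma:       # surprising token opens a new event
            boundaries.append(t)
    # Return (start, end) spans covering the whole sequence
    return list(zip(boundaries, boundaries[1:] + [len(surprise)]))
```

Each returned span would then be stored as one episodic "event" and later retrieved as a unit, which is what distinguishes this scheme from fixed-size chunking.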
Quick Start & Requirements
Install with `pip install -r requirements.txt` followed by `pip install -e .`. Configuration uses YAML files (`config/`) for parameters such as `chunk_size`, `model.path`, memory buffers (`n_init`, `n_local`, `n_mem`), and offloading thresholds. Evaluation requires downloading datasets (`scripts/download.sh`) and running `scripts/run.sh`, with support for multiple LLMs and benchmarks. Significant resource management (memory, disk, multi-GPU) is implied.
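A config file might look roughly like the sketch below. Only the key names mentioned above (`chunk_size`, `model.path`, `n_init`, `n_local`, `n_mem`) come from the README summary; the nesting, values, and the `offload_threshold` key are illustrative assumptions, so consult the files in `config/` for the actual schema.

```yaml
# Illustrative sketch only — layout and values are assumptions
model:
  path: /path/to/base-llm   # base LLM checkpoint; no fine-tuning required
chunk_size: 512             # tokens processed per step
n_init: 128                 # initial tokens pinned in the attention window
n_local: 4096               # recent-token (local) buffer size
n_mem: 2048                 # retrieved episodic-memory tokens per step
offload_threshold: 100000   # hypothetical cutoff for CPU/disk offloading
```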
Highlighted Details
EM-LLM outperforms SOTA retrieval models (InfLLM) and RAG on the LongBench and $\infty$-Bench benchmarks. It retrieves across sequences of 10 million tokens, a length infeasible for full-context models, while keeping resource usage comparable to RAG. Its event segmentation also correlates strongly with human event perception.
Maintenance & Community
The provided README lacks details on maintainers, community channels, sponsorships, or roadmaps.
Licensing & Compatibility
The README does not specify the software license or provide compatibility notes for commercial use.
Limitations & Caveats
As a research artifact (ICLR 2025), EM-LLM may require significant tuning for optimal performance and resource management (memory, disk offload). Its complex segmentation mechanisms introduce computational overhead, and production readiness is not explicitly stated.
Last updated 10 months ago; the repository is marked inactive.