kitfunso: Intelligent memory lifecycle for AI agents
Top 65.8% on SourcePulse
AI agents often suffer from a lack of persistent memory, forgetting context between sessions and tools. Hippo addresses this by providing a biologically-inspired, zero-dependency memory system that implements decay, retrieval strengthening, and consolidation. It acts as a shared memory layer, enabling AI agents to retain knowledge across different tools and sessions, benefiting multi-tool developers and teams by preventing repeated mistakes and organizing information effectively.
How It Works
Hippo models memory on the human hippocampus, using a three-tiered system: a volatile buffer for current-session data, an episodic store for timestamped memories, and a semantic store for consolidated patterns. New information enters the buffer, then gets encoded into episodic memory with assigned tags, strength, and a default half-life. During a "sleep" consolidation phase, repeated episodes are compressed into stable semantic patterns. Decay is the default: memories fade unless retrieved, and retrieval strengthens them, mimicking biological memory lifecycle principles.
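The decay and strengthening rules described above can be sketched as follows. This is an illustrative model only, assuming exponential half-life decay and an additive retrieval boost; the `Memory` type, `decayedStrength`, `retrieve`, and the `boost` value are hypothetical names, not Hippo's actual API:

```typescript
// Illustrative sketch of decay + retrieval strengthening (not Hippo's real API).

interface Memory {
  content: string;
  strength: number;      // current strength in [0, 1]
  halfLifeDays: number;  // default half-life assigned at encoding
  lastAccessMs: number;  // timestamp of last retrieval or encoding
}

const DAY_MS = 24 * 60 * 60 * 1000;

// Exponential decay: strength halves every halfLifeDays unless retrieved.
function decayedStrength(m: Memory, nowMs: number): number {
  const elapsedDays = (nowMs - m.lastAccessMs) / DAY_MS;
  return m.strength * Math.pow(2, -elapsedDays / m.halfLifeDays);
}

// Retrieval strengthens the memory (hypothetical additive boost)
// and resets its decay clock.
function retrieve(m: Memory, nowMs: number, boost = 0.2): Memory {
  const s = decayedStrength(m, nowMs);
  return { ...m, strength: Math.min(1, s + boost), lastAccessMs: nowMs };
}
```

Under this model, a memory encoded at strength 1.0 with a 7-day half-life sits at 0.5 a week later if untouched, while each retrieval both reads the decayed value and pushes the strength back up.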
Quick Start & Requirements
Install globally and initialize:

npm install -g hippo-memory
hippo init

The core runs with zero dependencies; embedding-based retrieval optionally requires @xenova/transformers.

Highlighted Details
Provides learning from git history (hippo learn --git) and failure monitoring (hippo watch) for continuous improvement.

Maintenance & Community
Issues and PRs are welcomed. The README lists several areas for potential contribution, such as improving LongMemEval scores, developing consolidation heuristics, and building a web UI. No specific community links (e.g., Discord, Slack) or prominent maintainer details are provided.
Licensing & Compatibility
MIT License. This license permits commercial use and integration into closed-source projects without significant restrictions.
Limitations & Caveats
While Hippo achieves a 74.0% Recall@5 on the LongMemEval benchmark using only BM25 (zero dependencies), reaching scores comparable to embedding-based systems requires installing the optional @xenova/transformers dependency. Some areas for future development include enhancing consolidation heuristics and creating a web UI.
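For context on the zero-dependency retrieval path, standard BM25 ranking can be sketched as below. This is a generic implementation of the well-known BM25 formula, not code from Hippo; the `k1` and `b` defaults are the conventional values, and the tokenizer is a deliberately naive assumption:

```typescript
// Generic BM25 scoring over an in-memory corpus (illustrative, not Hippo's code).

function tokenize(text: string): string[] {
  // Naive tokenizer: lowercase, split on non-word characters.
  return text.toLowerCase().split(/\W+/).filter(Boolean);
}

function bm25Scores(query: string, docs: string[], k1 = 1.2, b = 0.75): number[] {
  const tokenized = docs.map(tokenize);
  const avgLen = tokenized.reduce((sum, d) => sum + d.length, 0) / docs.length;
  const qTerms = tokenize(query);
  return tokenized.map((doc) => {
    let score = 0;
    for (const term of qTerms) {
      const tf = doc.filter((t) => t === term).length; // term frequency
      if (tf === 0) continue;
      const df = tokenized.filter((d) => d.includes(term)).length; // doc frequency
      const idf = Math.log(1 + (docs.length - df + 0.5) / (df + 0.5));
      // Length-normalized term saturation.
      score += (idf * tf * (k1 + 1)) / (tf + k1 * (1 - b + (b * doc.length) / avgLen));
    }
    return score;
  });
}
```

Retrieval quality beyond what lexical matching like this can deliver is exactly where the optional @xenova/transformers embeddings come in.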