LLM chatbot framework for long-term memory
Top 62.4% on sourcepulse
MemoryScope equips LLM chatbots with a robust long-term memory system, enabling applications such as personal assistants and emotional companions to learn user preferences and information over time. It provides a flexible framework for building and managing this memory, improving the user experience by giving the chatbot a sense of "understanding" the user.
How It Works
MemoryScope employs a modular architecture with a vector database (defaulting to Elasticsearch) for storing memory fragments. Its core functionality is atomized into over 20 specialized "workers" that handle tasks like filtering information, extracting observations, and updating insights. These workers are orchestrated into pipelines for memory retrieval (finding semantically related or time-specific memories) and memory consolidation (extracting and storing key user information from conversations). A "reflection and re-consolidation" process periodically updates insights and resolves contradictions or repetitions in memory.
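The worker/pipeline design described above can be sketched in a few lines of Python. All names below are illustrative placeholders, not MemoryScope's actual API: each worker performs one atomic step, and a pipeline chains workers over a shared context.

```python
# Hypothetical sketch of the worker/pipeline idea: each "worker" is an
# atomic step, and a pipeline chains workers over a shared context dict.
from typing import Callable, Dict, List

Worker = Callable[[Dict], Dict]  # a worker transforms the shared context

def filter_short_messages(ctx: Dict) -> Dict:
    # Drop trivial utterances before extracting memories.
    ctx["messages"] = [m for m in ctx["messages"] if len(m.split()) > 2]
    return ctx

def extract_observations(ctx: Dict) -> Dict:
    # Stand-in for an LLM call that turns messages into memory fragments.
    ctx["observations"] = [f"user said: {m}" for m in ctx["messages"]]
    return ctx

def run_pipeline(workers: List[Worker], ctx: Dict) -> Dict:
    for worker in workers:
        ctx = worker(ctx)
    return ctx

# A toy "memory consolidation" pipeline built from the two workers above.
consolidation = [filter_short_messages, extract_observations]
ctx = run_pipeline(consolidation, {"messages": ["hi", "I prefer tea over coffee"]})
print(ctx["observations"])  # → ['user said: I prefer tea over coffee']
```

In the real project, retrieval and re-consolidation pipelines would be composed the same way from different workers, with a vector store (Elasticsearch by default) behind the extraction and lookup steps.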
Quick Start & Requirements
Install from a local checkout with pip install -e .; set up pre-commit hooks for development.
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The README does not specify the project's license, which is a critical factor for commercial adoption or integration into closed-source projects. Future support for local LLM and embedding services is planned but not yet implemented.