MemoryScope by modelscope

A long-term memory framework for LLM chatbots

created 11 months ago
506 stars

Top 62.4% on sourcepulse

View on GitHub
Project Summary

MemoryScope equips LLM chatbots with a robust long-term memory system, enabling applications like personal assistants and emotional companions to learn user preferences and information over time. It provides a flexible framework for building and managing this memory, enhancing the user experience by giving users the sense that the LLM "understands" them.

How It Works

MemoryScope employs a modular architecture with a vector database (defaulting to Elasticsearch) for storing memory fragments. Its core functionality is atomized into over 20 specialized "workers" that handle tasks like filtering information, extracting observations, and updating insights. These workers are orchestrated into pipelines for memory retrieval (finding semantically related or time-specific memories) and memory consolidation (extracting and storing key user information from conversations). A "reflection and re-consolidation" process periodically updates insights and resolves contradictions or repetitions in memory.
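
As a rough illustration of this worker-and-pipeline design, the sketch below chains a retrieval worker and a time filter into a pipeline. All class and method names here are hypothetical stand-ins for exposition, not MemoryScope's actual API.

    # Hypothetical sketch of the worker/pipeline design described above;
    # names are illustrative and do not match MemoryScope's real API.
    from dataclasses import dataclass, field

    @dataclass
    class MemoryContext:
        """Shared state handed from one worker to the next."""
        query: str
        retrieved: list = field(default_factory=list)

    class Worker:
        """One atomized memory operation (filter, extract, rerank, ...)."""
        def run(self, ctx: MemoryContext) -> MemoryContext:
            raise NotImplementedError

    class SemanticRetrieveWorker(Worker):
        def run(self, ctx):
            # In MemoryScope this step would query the vector store
            # (Elasticsearch by default) for semantically similar fragments.
            ctx.retrieved.append(f"fragment related to: {ctx.query}")
            return ctx

    class TimeFilterWorker(Worker):
        def run(self, ctx):
            # Would drop fragments whose timestamps do not match any
            # time reference in the query ("last week", "in March", ...).
            return ctx

    class Pipeline:
        """Orchestrates workers into a retrieval or consolidation flow."""
        def __init__(self, workers):
            self.workers = workers

        def run(self, ctx: MemoryContext) -> MemoryContext:
            for worker in self.workers:
                ctx = worker.run(ctx)
            return ctx

    retrieval = Pipeline([SemanticRetrieveWorker(), TimeFilterWorker()])
    print(retrieval.run(MemoryContext(query="what coffee does the user like?")).retrieved)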

Quick Start & Requirements
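
The upstream README covers installation and configuration in detail; roughly, it amounts to installing the Python package (pip install memoryscope), running an Elasticsearch instance as the default vector store, and providing an API key for a supported backend (DashScope or OpenAI). The sketch below follows the usage pattern shown in that README; the exact class names, argument names, and backend identifiers are recalled from it and should be verified against the repository.

    # Usage sketch following the pattern in the upstream README; verify the
    # exact class/argument names and backend identifiers against the repo.
    from memoryscope import MemoryScope, Arguments  # assumes: pip install memoryscope

    arguments = Arguments(
        language="en",                              # assumed value; Chinese is also supported
        human_name="user",
        assistant_name="AI",
        memory_chat_class="api_memory_chat",
        generation_backend="dashscope_generation",  # an OpenAI backend can be used instead
        generation_model="qwen-max",
        embedding_backend="dashscope_embedding",
        embedding_model="text-embedding-v2",
        rank_backend="dashscope_rank",
        rank_model="gte-rerank",
    )

    # Requires a reachable Elasticsearch instance (the default vector store)
    # and DASHSCOPE_API_KEY (or OPENAI_API_KEY) set in the environment.
    with MemoryScope(arguments=arguments) as ms:
        chat = ms.default_memory_chat
        response = chat.chat_with_memory(query="Remember that I prefer oat-milk lattes.")
        print(response)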

Highlighted Details

  • Achieves low response times (~500ms) by decoupling backend memory operations from frontend retrieval (see the sketch after this list).
  • Features hierarchical and coherent memory, with insights aggregating similar observations and mechanisms to handle contradictions, repetitions, and filter fictitious content.
  • Incorporates time awareness for both memory retrieval and consolidation, ensuring accurate information recall based on temporal context.
  • Supports multiple LLM backends (OpenAI, DashScope) for generation, embedding, and reranking tasks.
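
To make the first point above concrete, here is a hypothetical illustration (not MemoryScope code) of that decoupling: consolidation of new conversation turns happens in a background thread, so the frontend retrieval path only pays for a quick lookup.

    # Hypothetical illustration of decoupled memory operations:
    # slow consolidation runs in the background, fast retrieval serves the chat.
    import queue
    import threading
    import time

    class MemoryStore:
        def __init__(self):
            self._fragments = []
            self._lock = threading.Lock()

        def retrieve(self, query: str) -> list:
            # Fast, read-only path used while answering the user.
            with self._lock:
                return [f for f in self._fragments if query.lower() in f.lower()]

        def add(self, fragment: str) -> None:
            with self._lock:
                self._fragments.append(fragment)

    def consolidation_worker(store: MemoryStore, inbox: queue.Queue) -> None:
        # Slow path: extract observations from raw turns without blocking the frontend.
        while True:
            turn = inbox.get()
            if turn is None:
                break
            time.sleep(0.2)  # stand-in for LLM-based extraction latency
            store.add(f"observation: {turn}")

    store = MemoryStore()
    inbox = queue.Queue()
    threading.Thread(target=consolidation_worker, args=(store, inbox), daemon=True).start()

    inbox.put("User says they prefer oat-milk lattes")  # queued for background consolidation
    print(store.retrieve("latte"))                      # frontend answers immediately

Because retrieval never waits on consolidation, the printed list may still be empty until the background worker catches up; that trade-off is what keeps chat latency low.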

Maintenance & Community

  • Actively maintained with recent releases (v0.1.1.0 as of Sept 2024).
  • Contributions are encouraged, with pre-commit hooks recommended for pull requests.
  • Citation available for academic use.

Licensing & Compatibility

  • License not explicitly stated in the README. Compatibility for commercial use or closed-source linking is undetermined.

Limitations & Caveats

The README does not specify the project's license, which is a critical factor for commercial adoption or integration into closed-source projects. Future support for local LLM and embedding services is planned but not yet implemented.

Health Check

  • Last commit: 5 months ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 1
  • Issues (30d): 1
  • Star History: 48 stars in the last 90 days
