memora by agentic-mcp-tools

Lightweight MCP server for AI agent persistent memory

Created 5 months ago
293 stars

Top 90.4% on SourcePulse

Project Summary

Memora provides AI agents with persistent, structured memory, enabling cross-session context and knowledge management. It targets developers building sophisticated AI systems by offering a lightweight MCP server for semantic storage, knowledge graphs, and advanced querying, thereby enhancing agent capabilities and data recall.

How It Works

Memora employs a lightweight MCP server architecture with persistent SQLite storage, optionally syncing to cloud services like S3, R2, or Cloudflare D1. It features hierarchical organization and supports multiple semantic search backends (TF-IDF, sentence-transformers, OpenAI embeddings). Core intelligence includes advanced hybrid search, AI-powered LLM deduplication for merging similar memories, and automatic cross-referencing to build a dynamic knowledge graph. This approach offers robust memory management with flexible deployment and powerful analytical capabilities.
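To make the TF-IDF backend concrete, here is a minimal, self-contained sketch of TF-IDF semantic ranking over stored memories. This is a general illustration of the technique, not Memora's actual internals; the function names are our own.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build one TF-IDF vector (term -> weight dict) per document."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(tokenized)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: (c / len(toks)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query, docs):
    """Rank stored memories by TF-IDF cosine similarity to the query."""
    vecs = tfidf_vectors(docs + [query])
    qvec, dvecs = vecs[-1], vecs[:-1]
    order = sorted(range(len(docs)),
                   key=lambda i: cosine(qvec, dvecs[i]), reverse=True)
    return [docs[i] for i in order]
```

Memora's hybrid search layers full-text, date, and tag filters on top of a ranking like this; the sentence-transformers and OpenAI backends swap the TF-IDF vectors for dense embeddings.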

Quick Start & Requirements

  • Install: pip install git+https://github.com/agentic-mcp-tools/memora.git
  • Optional Local Embeddings: pip install "memora[local]" (requires ~2GB disk space for PyTorch models).
  • Prerequisites: Python environment. Cloud storage options (D1, S3/R2) require respective account setup and credentials. OpenAI embeddings/deduplication require an API key.
  • Links: detailed configuration examples are provided in the project README.
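Once installed, an MCP server is typically registered in the client's configuration file. The entry below is a hypothetical sketch using the common `mcpServers` shape; the actual command and arguments for Memora are assumptions here, so consult the README's configuration examples for the real values.

```json
{
  "mcpServers": {
    "memora": {
      "command": "python",
      "args": ["-m", "memora"]
    }
  }
}
```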

Highlighted Details

  • Storage Flexibility: SQLite local storage with optional cloud sync via S3, R2, or Cloudflare D1.
  • Advanced Search: Supports TF-IDF, sentence-transformers, and OpenAI embeddings for semantic search, combined with full-text, date, and tag filtering.
  • LLM Deduplication: Utilizes AI models to identify and intelligently merge duplicate memories.
  • Interactive Knowledge Graph: Real-time visualization with Mermaid rendering, cluster overlays, and a hosted option for D1 users.
  • Memory Automation: Built-in tools for managing TODOs, issues, and structured sections.
  • Memory Insights: LLM-powered analysis for identifying themes, knowledge gaps, and consolidation opportunities.
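The deduplication idea above can be sketched without an LLM: flag memory pairs whose embedding similarity crosses a threshold, then merge them. Memora delegates the merge decision to an AI model; the "keep the longer variant" rule below is only a stand-in for that step, and all names here are illustrative.

```python
import math

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def dedupe(memories, embeddings, threshold=0.9):
    """Greedily merge memories whose embeddings are near-duplicates.

    embeddings[i] is the vector for memories[i]. An LLM would decide how
    to merge the texts; here we simply keep the longer variant.
    """
    merged, used = [], set()
    for i, (text, vec) in enumerate(zip(memories, embeddings)):
        if i in used:
            continue
        keep = text
        for j in range(i + 1, len(memories)):
            if j not in used and cosine(vec, embeddings[j]) >= threshold:
                used.add(j)
                keep = max(keep, memories[j], key=len)  # stand-in for LLM merge
        merged.append(keep)
    return merged
```

A real pipeline would recompute cross-references after merging, which is also how the knowledge graph stays consistent.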

Maintenance & Community

Information regarding maintainers, community channels (e.g., Discord, Slack), or project roadmap is not detailed in the provided README.

Licensing & Compatibility

The README does not state an open-source license for Memora. Verify the licensing terms before any commercial use or integration.

Limitations & Caveats

Manual rebuilding of embeddings and cross-references is necessary after changing embedding models. Cloud storage and LLM features incur external service costs and require configuration. The maturity level of the project (e.g., alpha, beta, stable) is not explicitly mentioned. License details are absent.

Health Check

  • Last Commit: 4 days ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 3
  • Issues (30d): 0
  • Star History: 212 stars in the last 30 days

Starred by Eric Zhu (coauthor of AutoGen; Research Scientist at Microsoft Research) and Chip Huyen (author of "AI Engineering" and "Designing Machine Learning Systems").
