mem-agent-mcp by firstbatchxyz

Personal memory agent MCP server

Created 2 weeks ago

354 stars

Top 78.8% on SourcePulse

Project Summary

This project provides a Model Context Protocol (MCP) server for the driaforall/mem-agent, enabling users to connect their personal memory systems to applications like Claude Desktop and LM Studio. It targets developers and power users seeking to integrate LLMs with structured personal knowledge bases, offering enhanced contextual assistance by leveraging local, private data.

How It Works

The system functions as an MCP server, interfacing with a specialized LLM fine-tuned for memory management. It utilizes local deployment via vLLM or MLX for enhanced privacy and performance. Data is organized in an Obsidian-style Markdown format, featuring a user.md file and an entities/ directory with wikilink navigation for relationships. Various connectors import data from sources like ChatGPT, Notion, GitHub, and Google Docs, transforming it into this structured format, which the MCP server then makes accessible to connected applications.
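The layout described above can be sketched on disk. Only `user.md` and the `entities/` directory come from the documentation; the file names, fields, and link targets below are invented for illustration. A `[[wikilink]]` is just a relative path, so relationships can be traversed with plain text tools:

```shell
# Sketch of the Obsidian-style memory layout (contents are hypothetical).
mkdir -p memory/entities
cat > memory/user.md <<'EOF'
# User
- works_at: [[entities/acme.md]]
EOF
cat > memory/entities/acme.md <<'EOF'
# Acme
- employs: [[user.md]]
EOF
# Extract the wikilinks from the root file to see the relationship graph's edges:
grep -o '\[\[[^]]*\]\]' memory/user.md
```

The agent's own traversal logic is more involved, but this shows why the Markdown-plus-wikilinks format stays inspectable and editable without any special tooling.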

Quick Start & Requirements

  • Installation: Primarily uses make commands. Key steps include make setup to configure the memory directory and make run-agent to start the agent, which prompts for a model precision (e.g., 4-bit quantization to reduce memory use).
  • Prerequisites: Supported platforms include macOS (Metal backend) and Linux (with GPU and vLLM backend). Specific LLM model files are required.
  • Guided setup: An interactive "Memory Wizard" is available via make memory-wizard or python memory_wizard.py. Memory connectors can be managed via make connect-memory or python memory_connectors/memory_connect.py.
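The make-based steps above amount to a short session. The command names come from this summary; they are shown commented out because they only work inside a cloned mem-agent-mcp checkout with its Makefile:

```shell
# Typical setup session (run inside the mem-agent-mcp project directory):
#   make setup            # choose and configure the memory directory
#   make run-agent        # start the agent; prompts for model precision (e.g. 4-bit)
#   make memory-wizard    # optional: interactive guided setup
#   make connect-memory   # import data (ChatGPT, Notion, GitHub, Google Docs, ...)
```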

Highlighted Details

  • Data Connectors: Supports importing data from ChatGPT (exports), Notion (exports), Nuclino (exports), GitHub (via API), and Google Docs (via Drive API).
  • Memory Organization: Data is structured into topics and entity relationships using Markdown files, facilitating efficient retrieval.
  • Query Filtering: Users can attach filters to queries to scope what is retrieved, or to withhold (obfuscate) sensitive information from responses.
  • Application Integration: Designed for seamless integration with Claude Desktop, LM Studio, and Claude Code.
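Integration with Claude Desktop works through the standard MCP registration mechanism: an `mcpServers` entry in `claude_desktop_config.json`. The `command` and `args` values below are placeholders, not mem-agent-mcp's actual launch command; consult the project's README for the real values.

```shell
# Hedged sketch of an MCP server registration for Claude Desktop.
# On macOS the real file lives under ~/Library/Application Support/Claude/.
cat > claude_desktop_config.json <<'EOF'
{
  "mcpServers": {
    "mem-agent": {
      "command": "python",
      "args": ["/path/to/mem-agent-mcp/server.py"]
    }
  }
}
EOF
# Sanity-check that the JSON parses and the server entry is present:
python3 -c "import json; print(list(json.load(open('claude_desktop_config.json'))['mcpServers']))"
# prints: ['mem-agent']
```

LM Studio and Claude Code use their own registration steps, but the shape of the handshake (a named server plus a launch command) is the same across MCP clients.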

Maintenance & Community

The project encourages community contributions, particularly for new connectors and improvements, but does not list specific contributors, sponsorships, or community channels (like Discord/Slack) in the provided documentation.

Licensing & Compatibility

The license type is not explicitly stated in the provided README. The system is designed for local deployment, emphasizing privacy and compatibility with applications that support the MCP protocol.

Limitations & Caveats

The system primarily targets macOS and Linux environments with GPU support. Setup requires careful configuration of memory directories and application integrations. The README does not specify an alpha or beta status, but the focus on local LLM deployment implies potential resource requirements and a need for technical proficiency.

Health Check

  • Last commit: 3 days ago
  • Responsiveness: Inactive
  • Pull requests (30d): 3
  • Issues (30d): 1
  • Star history: 371 stars in the last 17 days
