Lyellr88
AI collaborator with persistent, evolving memory
Top 100.0% on SourcePulse
MARM MCP addresses AI's inherent lack of persistent memory by providing a unified, memory-powered collaborator. It acts as a universal MCP server, enabling AI agents to learn, remember, and share context across sessions and tools, enhancing productivity for developers and researchers.
How It Works
The system utilizes the MARM protocol with a FastAPI backend and SQLite for persistent storage, integrating Sentence Transformers for semantic search and auto-classification. This architecture allows multiple AI agents to share knowledge seamlessly, ensuring long-term recall and session continuity beyond stateless LLMs. It supports HTTP, STDIO, and experimental WebSocket transports.
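The persistence pattern described above can be sketched with a minimal store. This is an illustration only: the table name, fields, and keyword-based recall are assumptions for the example, not MARM's actual schema, and the real server replaces the keyword match with Sentence Transformers embeddings for semantic search.

```python
import sqlite3


class MemoryStore:
    """Minimal sketch of a persistent, session-scoped memory store.

    Hypothetical schema for illustration; MARM's real implementation
    differs and adds embedding-based semantic search.
    """

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memories ("
            " id INTEGER PRIMARY KEY,"
            " session TEXT, agent TEXT, content TEXT)"
        )

    def remember(self, session, agent, content):
        # Any agent can write to a shared session, so context
        # survives across tools and restarts.
        self.db.execute(
            "INSERT INTO memories (session, agent, content) VALUES (?, ?, ?)",
            (session, agent, content),
        )
        self.db.commit()

    def recall(self, session, keyword):
        # Keyword LIKE-match stands in for semantic search here.
        return self.db.execute(
            "SELECT agent, content FROM memories"
            " WHERE session = ? AND content LIKE ?",
            (session, f"%{keyword}%"),
        ).fetchall()


store = MemoryStore()
store.remember("s1", "agent-a", "User prefers Rust for CLI tools")
print(store.recall("s1", "Rust"))
```

Because the store is backed by a file path rather than an in-process dict, a second agent (or a later session) reconnecting to the same database sees earlier memories, which is the continuity the protocol is built around.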
Quick Start & Requirements
Install via Docker (`docker pull lyellr88/marm-mcp-server:latest`, then `docker run ...`) or via pip (`pip install marm-mcp-server`). Platform-specific setup guides are provided (INSTALL-DOCKER.md, INSTALL-WINDOWS.md).
Maintenance & Community
The project is actively maintained with planned Q1 2026 improvements in semantic search and session handling. Community engagement is fostered through MARM Discord and GitHub Discussions.
Licensing & Compatibility
Limitations & Caveats
WebSocket transport is in beta. Local development authentication is mock-based and not production-ready; multi-user OAuth is planned as a future enhancement. The project is early in its roadmap, so users should expect ongoing development and breaking changes.