OpenMemory by CaviraOSS

AI memory engine for persistent, explainable recall

Created 2 weeks ago

1,403 stars

Top 28.9% on SourcePulse

Project Summary

OpenMemory is an open-source, self-hosted AI memory engine that adds persistent, structured, and explainable long-term memory to LLM applications. It targets developers building AI agents, assistants, and copilots who need a secure, efficient, and framework-agnostic memory layer. Compared with traditional vector databases and SaaS memory layers, it promises faster recall, lower latency, and reduced costs through its Hierarchical Memory Decomposition (HMD) architecture.

How It Works

OpenMemory employs a Hierarchical Memory Decomposition (HMD) architecture that departs from flat, single-embedding approaches. Each memory is embedded across multiple sectors (episodic, semantic, procedural, emotional, and reflective) and connected through a single-waypoint linking mechanism in a biologically inspired graph. Retrieval uses composite similarity: per-sector scores are fused, and activation spreading through the graph surfaces related memories. The result is higher recall accuracy, lower latency, and explainable reasoning paths, while preserving data ownership and cost efficiency.
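The sector-fusion idea can be sketched in a few lines of TypeScript. This is an illustrative toy, not OpenMemory's actual code: the sector weights, types, and function names here are hypothetical, and the real engine also applies activation spreading over its memory graph, which this sketch omits.

```typescript
// Hypothetical sketch of composite similarity via sector fusion.
type Sector = "episodic" | "semantic" | "procedural" | "emotional" | "reflective";
type SectorEmbeddings = Record<Sector, number[]>;

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Illustrative fixed weights; a real system might learn or tune these per query.
const weights: Record<Sector, number> = {
  episodic: 0.3, semantic: 0.3, procedural: 0.2, emotional: 0.1, reflective: 0.1,
};

// Fuse per-sector cosine similarities into one composite score.
function compositeSimilarity(query: SectorEmbeddings, memory: SectorEmbeddings): number {
  let score = 0;
  for (const sector of Object.keys(weights) as Sector[]) {
    score += weights[sector] * cosine(query[sector], memory[sector]);
  }
  return score;
}
```

Because each sector contributes independently, a per-sector breakdown of the score doubles as an explanation of why a memory was recalled.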

Quick Start & Requirements

  • Manual install: clone the repository, copy .env.example to .env, run npm install in the backend directory, and start the server with npx tsx src/server.ts.
  • Docker install: docker compose up --build -d.
  • Prerequisites: Node.js 20+, SQLite 3.40+ (bundled). Optional: Ollama, OpenAI, or Gemini for embeddings.
  • Configuration: Environment variables in .env control port, database path, embedding provider, and various memory parameters.
  • Links: GitHub repository: https://github.com/caviraoss/openmemory.git. Discord server link available in README.
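The manual and Docker steps above can be collected into a single script. The repository URL is taken from the link above; the location of .env.example and the backend directory layout are assumptions based on the instructions, so check the project README before running.

```shell
# Manual setup (assumes Node.js 20+; paths assumed from the steps above)
git clone https://github.com/caviraoss/openmemory.git
cd openmemory
cp .env.example .env          # set port, database path, embedding provider
cd backend
npm install
npx tsx src/server.ts         # start the server

# Or, with Docker:
docker compose up --build -d
```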

Highlighted Details

  • Reports 2–3x faster contextual recall and 6–10x lower cost than hosted alternatives such as Zep or Supermemory.
  • Supports multiple embedding models, including local options like Ollama, E5, and BGE, alongside OpenAI and Gemini.
  • Offers explainable recall paths and full data ownership, with optional AES-GCM content encryption for enhanced privacy.
  • Provides built-in support for LangGraph and the Model Context Protocol (MCP), easing integration with various AI runtimes.
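The optional AES-GCM content encryption mentioned above can be illustrated with Node's built-in crypto module. This is a generic AES-256-GCM round trip, not OpenMemory's actual implementation; its key management and on-disk format are not documented here, so treat the shape of the encrypted record as an assumption.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Encrypted record shape is illustrative, not OpenMemory's storage format.
interface EncryptedBox { iv: Buffer; tag: Buffer; data: Buffer }

// Encrypt UTF-8 content with AES-256-GCM under a 32-byte key.
function encrypt(plain: string, key: Buffer): EncryptedBox {
  const iv = randomBytes(12); // standard 96-bit GCM nonce, unique per message
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plain, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

// Decrypt and authenticate; throws if the ciphertext or tag was tampered with.
function decrypt(box: EncryptedBox, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", key, box.iv);
  decipher.setAuthTag(box.tag);
  return Buffer.concat([decipher.update(box.data), decipher.final()]).toString("utf8");
}
```

GCM's authentication tag means tampered memory content fails to decrypt rather than silently returning garbage, which matters for a store feeding context back into an LLM.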

Maintenance & Community

The project is actively developed, with v1.2 (Dashboard + metrics) in progress and future plans for learned sector classifiers (v1.3) and federated multi-node modes (v1.4). Notable contributors include Morven, Muhammad Fiaz, Peter Chung, Brett Ammeson, and Joseph Goksu. A Discord server is available for community engagement.

Licensing & Compatibility

OpenMemory is released under the MIT License, permitting commercial use and integration into closed-source projects without copyleft restrictions.

Limitations & Caveats

The current implementation targets single-node deployment; federated multi-node support is planned for v1.4, and learned sector classification is still under development (v1.3). While SQLite is bundled, performance at extreme scale may require the pluggable storage backends on the roadmap (v1.1).

Health Check

  • Last Commit: 1 day ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 18
  • Issues (30d): 11
  • Star History: 1,425 stars in the last 16 days
