mem0 by mem0ai

AI agent memory layer for personalized interactions

Created 2 years ago
42,618 stars

Top 0.6% on SourcePulse

View on GitHub
Project Summary

Mem0 provides an intelligent, multi-level memory layer for AI agents, enhancing personalization and enabling agents to learn user preferences over time. It is designed for developers building AI assistants, customer support bots, and autonomous systems, offering improved accuracy and efficiency compared to full-context memory.

How It Works

Mem0 utilizes a multi-level memory architecture (User, Session, Agent) with adaptive personalization. It integrates with various Large Language Models (LLMs), defaulting to OpenAI's GPT-4o-mini. The system retrieves relevant memories via a search query, incorporates them into the LLM's prompt, and then stores the new conversational context for later turns. This approach reduces token usage and latency while maintaining high accuracy: the project's research reports +26% accuracy, 91% faster responses, and 90% lower token usage than full-context methods on the LOCOMO benchmark.
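
A minimal sketch of that retrieve-augment-store loop using the open-source Python package, assuming the Memory class and an OPENAI_API_KEY in the environment; method names follow the project's quickstart, but exact signatures and response shapes may differ across versions:

    from openai import OpenAI
    from mem0 import Memory

    openai_client = OpenAI()   # reads OPENAI_API_KEY from the environment
    memory = Memory()          # defaults to OpenAI's gpt-4o-mini

    def chat_with_memories(message: str, user_id: str = "default_user") -> str:
        # 1. Retrieve memories relevant to the incoming message.
        related = memory.search(query=message, user_id=user_id, limit=3)
        hits = related.get("results", []) if isinstance(related, dict) else related
        memories_str = "\n".join(f"- {h['memory']}" for h in hits)

        # 2. Incorporate them into the LLM prompt.
        system_prompt = (
            "You are a helpful assistant. Use the user's memories when relevant.\n"
            f"User memories:\n{memories_str}"
        )
        reply = openai_client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": message},
            ],
        ).choices[0].message.content

        # 3. Store the new conversational context for future turns.
        memory.add(
            [
                {"role": "user", "content": message},
                {"role": "assistant", "content": reply},
            ],
            user_id=user_id,
        )
        return reply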

Quick Start & Requirements

  • Install: pip install mem0ai or npm install mem0ai.
  • Prerequisites: an LLM provider; OpenAI's GPT-4o-mini is the default, and an API key for the chosen provider must be supplied (see the configuration sketch after this list).
  • Resources: No specific hardware requirements are listed beyond standard Python/Node.js environments.
  • Links: Quickstart Guide, API Reference.
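
A hedged configuration sketch in Python, assuming the package's Memory and Memory.from_config entry points; the provider names and config keys shown are illustrative and may differ between versions:

    import os
    from mem0 import Memory

    # The default setup only needs an OpenAI key in the environment.
    os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key

    # Default configuration: OpenAI's gpt-4o-mini handles memory operations.
    memory = Memory()

    # Optional override via a config dict (keys are illustrative).
    memory = Memory.from_config({
        "llm": {
            "provider": "openai",
            "config": {"model": "gpt-4o-mini", "temperature": 0.1},
        }
    })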

Highlighted Details

  • Achieves +26% accuracy, 91% faster responses, and 90% lower token usage than full-context memory on the LOCOMO benchmark.
  • Supports multi-level memory: User, Session, and Agent state (illustrated in the sketch after this list).
  • Offers integrations with ChatGPT, LangGraph, and CrewAI, along with a browser extension.
  • Provides both a hosted platform and a self-hosted open-source package.
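
A short sketch of scoping memories to the three levels, assuming add() and search() accept user_id, agent_id, and run_id keyword arguments as in the open-source package; exact signatures and return shapes may vary by version:

    from mem0 import Memory

    memory = Memory()

    # User level: long-lived preferences tied to a person.
    memory.add("Prefers vegetarian food and concise answers.", user_id="alice")

    # Agent level: knowledge tied to a specific assistant.
    memory.add("Escalate billing disputes to a human.",
               user_id="alice", agent_id="support-bot")

    # Session level: context for a single run or conversation.
    memory.add("Currently troubleshooting a failed payment.",
               user_id="alice", agent_id="support-bot", run_id="session-42")

    # Retrieval can be scoped the same way.
    results = memory.search("What does the user prefer to eat?", user_id="alice")
    hits = results.get("results", []) if isinstance(results, dict) else results
    for hit in hits:
        print(hit["memory"])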

Maintenance & Community

  • Active community via Discord.
  • Project updates and engagement on Twitter.

Licensing & Compatibility

  • License: Apache 2.0.
  • Compatibility: Permissive license suitable for commercial use and integration into closed-source applications.

Limitations & Caveats

The project is positioned as "Building Production-Ready AI Agents," but the README does not elaborate on production readiness, scalability limits, or potential failure modes. The default dependency on OpenAI's GPT-4o-mini may also be a consideration for users who want a provider-independent or fully self-hosted stack.

Health Check

  • Last Commit: 18 hours ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 114
  • Issues (30d): 66

Star History

1,909 stars in the last 30 days

Explore Similar Projects

Starred by Tobi Lutke (Cofounder of Shopify), Chip Huyen (Author of "AI Engineering" and "Designing Machine Learning Systems"), and 9 more.

companion-app by a16z-infra

AI companion stack for personalized chatbots

Top 0.1% on SourcePulse
6k stars
Created 2 years ago
Updated 1 year ago