memory-lancedb-pro by win4r

Advanced memory management for AI agents

Created 1 week ago


1,045 stars

Top 35.9% on SourcePulse

Project Summary

memory-lancedb-pro enhances the OpenClaw AI agent framework by providing advanced long-term memory capabilities. It targets developers needing sophisticated context management, offering hybrid retrieval (vector + BM25), cross-encoder reranking, multi-scope isolation, and a management CLI. This plugin significantly boosts agent recall accuracy and contextual relevance beyond basic vector stores.

How It Works

This OpenClaw plugin integrates LanceDB for persistent memory storage, supporting both vector embeddings and BM25 full-text indexing. Its core innovation lies in a multi-stage retrieval pipeline: queries are processed via hybrid vector/BM25 search, fused using a boosted Reciprocal Rank Fusion (RRF) strategy, and then reranked by configurable cross-encoders (e.g., Jina, SiliconFlow). Further refinements include recency boosts, time decay, length normalization, and MMR diversity, ensuring relevant and non-redundant memory injection. Adaptive retrieval and noise filtering optimize query processing.
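The fusion step can be illustrated with a minimal Reciprocal Rank Fusion sketch. This is not the plugin's actual code; the function name, the damping constant `k = 60`, and the per-retriever weights are illustrative assumptions, chosen only to show how a "boosted" RRF combines two ranked lists:

```typescript
// Minimal Reciprocal Rank Fusion (RRF) sketch: fuse two ranked lists of
// document ids. The constant k damps the influence of top ranks; weights
// let one retriever (e.g. vector search) be boosted over the other.
// Illustrative only -- not memory-lancedb-pro's implementation.
type Ranked = string[]; // document ids, best first

function rrfFuse(
  vectorHits: Ranked,
  bm25Hits: Ranked,
  k = 60,            // conventional RRF damping constant
  vectorWeight = 1.0,
  bm25Weight = 0.8,  // illustrative boost toward semantic matches
): string[] {
  const scores = new Map<string, number>();
  const add = (hits: Ranked, weight: number) => {
    hits.forEach((id, rank) => {
      scores.set(id, (scores.get(id) ?? 0) + weight / (k + rank + 1));
    });
  };
  add(vectorHits, vectorWeight);
  add(bm25Hits, bm25Weight);
  // Sort by fused score, best first.
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}

// A doc ranked by both retrievers rises above one ranked by only one:
const fused = rrfFuse(["a", "b", "c"], ["b", "d"]);
console.log(fused[0]); // "b" combines 1/62 + 0.8/61, topping "a" at 1/61
```

In the plugin's pipeline, a list fused this way would then be passed to the cross-encoder reranker and the later scoring stages.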

Quick Start & Requirements

Installation involves cloning the repository into the OpenClaw workspace's plugins/ directory (e.g., workspace/plugins/memory-lancedb-pro) and running npm install. Alternatively, clone elsewhere and specify an absolute path in openclaw.json. Prerequisites include an OpenClaw environment, Node.js, and an OpenAI-compatible embedding API key (e.g., Jina, OpenAI, Gemini, Ollama). Configuration requires setting up embedding models and optionally reranker endpoints. A video tutorial is available at https://youtu.be/MtukF1C8epQ.
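The in-workspace install described above looks roughly like the following. The repository URL is an assumption inferred from the author and project name; adjust it (and the workspace path) to match your setup:

```shell
# Clone the plugin into the OpenClaw workspace's plugins/ directory
# (repo URL assumed from the author/name above; verify before use).
cd workspace/plugins
git clone https://github.com/win4r/memory-lancedb-pro
cd memory-lancedb-pro
npm install
```

If you clone elsewhere instead, point openclaw.json at the clone's absolute path, as noted above.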

Highlighted Details

  • Hybrid retrieval combining vector (semantic) and BM25 (keyword) search with configurable fusion weights.
  • Cross-encoder reranking for improved relevance, with fallback mechanisms and support for multiple providers.
  • Multi-stage scoring pipeline incorporating recency, time decay, importance, length normalization, and diversity (MMR).
  • Granular multi-scope isolation (global, agent, custom, project, user) for access control.
  • Adaptive retrieval intelligently skips non-memory queries, while noise filtering removes low-quality captures.
  • A comprehensive Management CLI for direct memory manipulation (list, search, delete, export/import, reembed).
  • Broad compatibility with various OpenAI-compatible embedding providers, including local Ollama instances.
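The MMR diversity step listed above can be sketched as follows. This is an illustrative implementation of Maximal Marginal Relevance, not the plugin's actual code; the `lambda` default and the candidate shape are assumptions:

```typescript
// Maximal Marginal Relevance (MMR) sketch: select results that balance
// relevance to the query against redundancy with already-selected results.
// lambda = 1 is pure relevance; lambda = 0 is pure diversity.
// Illustrative only -- not memory-lancedb-pro's implementation.
interface Candidate {
  id: string;
  relevance: number;   // score from the retrieval/rerank stages
  embedding: number[]; // vector used to measure redundancy
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

function mmrSelect(cands: Candidate[], k: number, lambda = 0.7): Candidate[] {
  const picked: Candidate[] = [];
  const pool = [...cands];
  while (picked.length < k && pool.length > 0) {
    let bestIdx = 0, bestScore = -Infinity;
    pool.forEach((c, i) => {
      // Penalize closeness to anything already selected.
      const redundancy = picked.length
        ? Math.max(...picked.map(p => cosine(c.embedding, p.embedding)))
        : 0;
      const score = lambda * c.relevance - (1 - lambda) * redundancy;
      if (score > bestScore) { bestScore = score; bestIdx = i; }
    });
    picked.push(pool.splice(bestIdx, 1)[0]);
  }
  return picked;
}

// With two near-duplicate high-relevance memories and one distinct one,
// MMR (k = 2) keeps one duplicate plus the distinct memory:
const picked = mmrSelect([
  { id: "a", relevance: 0.9,  embedding: [1, 0] },
  { id: "b", relevance: 0.85, embedding: [0.99, 0.01] },
  { id: "c", relevance: 0.6,  embedding: [0, 1] },
], 2);
console.log(picked.map(p => p.id)); // ["a", "c"]
```

Applied after reranking, a step like this keeps the injected memory context varied rather than stacking near-duplicates.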

Maintenance & Community

The README does not detail specific maintainers, sponsorships, or community channels like Discord/Slack. Primary community interaction points are likely the GitHub repository itself and associated OpenClaw channels.

Licensing & Compatibility

The project is released under the permissive MIT license, allowing for commercial use and integration into closed-source applications without significant restrictions.

Limitations & Caveats

This plugin requires an active OpenClaw installation and configuration of external embedding/reranking services. Session memory is disabled by default. The extensive configuration options, while powerful, may present a learning curve for users seeking basic memory functionality.

Health Check

  • Last Commit: 22 hours ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 27
  • Issues (30d): 30
  • Star History: 1,075 stars in the last 9 days

Explore Similar Projects

Starred by Eric Zhu (coauthor of AutoGen; Research Scientist at Microsoft Research) and Chip Huyen (author of "AI Engineering" and "Designing Machine Learning Systems").

ReMe by agentscope-ai (61.6%, 2k stars): LLM chatbot framework for long-term memory. Created 1 year ago, updated 21 hours ago.