Second brain for LLMs, providing contextual knowledge
Supermemory provides a scalable, self-hostable memory API that augments Large Language Models (LLMs) with contextual knowledge. It targets developers and individuals who want to build AI-powered applications or improve personal knowledge management by turning scattered data into actionable insights. The core benefit is equipping LLMs with relevant, easily searchable personal or organizational data.
How It Works
Supermemory acts as a universal engine for LLM memory, focusing on data organization and search. It ingests data from a variety of sources (web pages, bookmarks, contacts, tweets, Notion) and stores it in a searchable form, using Postgres with the pgvector extension for efficient similarity search. The architecture comprises a web UI (Remix, Hono), a Chrome extension for easy data capture, and a backend on Cloudflare Workers, with R2 for object storage and Cloudflare KV for caching. This stack prioritizes speed, scalability, and self-hostability.
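To illustrate the retrieval step, the sketch below shows the kind of similarity ranking pgvector performs (its `<=>` operator computes cosine distance), done in memory for clarity. The names (`MemoryRecord`, `rankBySimilarity`) are illustrative assumptions, not supermemory's actual API.

```typescript
// Hypothetical in-memory sketch of similarity search; in supermemory this
// work is delegated to Postgres + pgvector rather than done in application code.

interface MemoryRecord {
  id: string;
  text: string;
  embedding: number[]; // produced by an embedding model at ingest time
}

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k records most similar to the query embedding, mirroring a
// pgvector query like: ORDER BY embedding <=> $query LIMIT $k
function rankBySimilarity(
  query: number[],
  records: MemoryRecord[],
  k: number,
): MemoryRecord[] {
  return [...records]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding),
    )
    .slice(0, k);
}
```

The top-k results would then be passed to the LLM as context alongside the user's prompt.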
Quick Start & Requirements
Setup instructions are provided in SELF-HOSTING-GUIDE.md. A Chrome extension is available on the Chrome Web Store.
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The CC BY-NC-SA 4.0 license restricts commercial use without explicit permission, potentially limiting adoption in enterprise environments. The project is primarily developed by a single individual, which may impact long-term maintenance and development velocity.
Last updated 2 months ago; activity status: inactive.