getmetal/motorhead: LLM memory server (deprecated) for chat applications
Motorhead is a deprecated memory and information retrieval server designed to simplify state management for LLM-powered chat applications. It offers three core APIs for managing conversation history, enabling incremental summarization, and performing text-based retrieval using Vector Similarity Search (VSS).
How It Works
Motorhead stores conversation messages in sessions, automatically summarizing older messages to manage context window limits. When the message window reaches a configured maximum (MOTORHEAD_MAX_WINDOW_SIZE), it triggers a summarization process, reducing the token count and maintaining a concise history. For long-term memory, it leverages Redisearch VSS, allowing efficient retrieval of relevant information based on text queries, segmented by session ID.
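As a rough sketch of that workflow, the calls below assume the REST endpoints described in the project's README (/sessions/{id}/memory for conversation history and /sessions/{id}/retrieval for VSS lookups), a server on the default port 8000, and illustrative session IDs and payload shapes; verify the exact payload format against the README before relying on it.

```bash
# Append two turns to a session; older turns are summarized once the
# window exceeds MOTORHEAD_MAX_WINDOW_SIZE.
curl -X POST http://localhost:8000/sessions/demo-session/memory \
  -H "Content-Type: application/json" \
  -d '{"messages": [
        {"role": "Human", "content": "What is the return policy?"},
        {"role": "AI", "content": "Returns are accepted within 30 days."}
      ]}'

# Fetch the current message window plus the running summary for the session.
curl http://localhost:8000/sessions/demo-session/memory

# Long-term memory: text-based retrieval over this session via Redisearch VSS
# (requires MOTORHEAD_LONG_TERM_MEMORY=true).
curl -X POST http://localhost:8000/sessions/demo-session/retrieval \
  -H "Content-Type: application/json" \
  -d '{"text": "return policy"}'
```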
Quick Start & Requirements
Run with docker-compose build && docker-compose up, or pull the prebuilt image with docker pull ghcr.io/getmetal/motorhead:latest and run it directly. Redis (REDIS_URL environment variable) and an OpenAI API key (OPENAI_API_KEY) are required; Azure deployments need additional Azure-specific environment variables. Configurable settings include MOTORHEAD_MAX_WINDOW_SIZE (default 12), MOTORHEAD_LONG_TERM_MEMORY (default false), MOTORHEAD_MODEL (default gpt-3.5-turbo), PORT (default 8000), and OPENAI_API_BASE (default OpenAI v1).
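A minimal single-container invocation, using the image and environment variables listed above and assuming a Redis instance with the Redisearch module is already reachable at the given REDIS_URL (the Redis address and API key below are placeholders):

```bash
# Pull the prebuilt image and run it against an existing Redis/Redisearch instance.
docker pull ghcr.io/getmetal/motorhead:latest

docker run -p 8000:8000 \
  -e REDIS_URL="redis://host.docker.internal:6379" \
  -e OPENAI_API_KEY="sk-..." \
  -e MOTORHEAD_MAX_WINDOW_SIZE=12 \
  -e MOTORHEAD_LONG_TERM_MEMORY=true \
  -e MOTORHEAD_MODEL="gpt-3.5-turbo" \
  ghcr.io/getmetal/motorhead:latest
```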
Maintenance & Community
The project is marked as DEPRECATED and support is no longer maintained.
Licensing & Compatibility
The license is not specified in the README.
Limitations & Caveats
The project is explicitly marked as DEPRECATED, meaning it will not receive further updates or support, posing a significant risk for adoption. The README does not specify the license, which could impact commercial use or integration into closed-source projects.