CLI tool for injecting project context into LLM chats
LLM Context helps developers inject relevant code and text into Large Language Model (LLM) chats, giving the AI the project context it needs for tasks like code review and documentation. It is aimed at developers working with LLM interfaces that offer persistent context, such as Claude Desktop or Custom GPTs, and provides a streamlined workflow for sharing project information.
How It Works
LLM Context utilizes .gitignore patterns for intelligent file selection and supports direct integration with LLMs via the Model Context Protocol (MCP) or a clipboard-based CLI workflow. It employs a rule-based system, now configured via Markdown files with YAML front matter, allowing customization for different tasks and enabling features like smart code outlining and definition implementation extraction.
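As a rough sketch of what one of these rules could look like (the path, file name, and front-matter fields below are illustrative assumptions, not the tool's documented schema), a rule file might resemble:

.llm-context/rules/code-review.md (hypothetical path)

---
# front-matter keys shown here are illustrative, not the documented schema
description: Context for reviewing the core package
gitignores:
  - "*.lock"
  - dist/
---
Please review the selected files, focusing on error handling and test coverage.

The Markdown body beneath the front matter carries the task instructions, while the front matter narrows which files are pulled into the context.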
Quick Start & Requirements
uv tool install "llm-context>=0.3.0"
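After installation, a typical clipboard-based session looks roughly like the following (command names are taken from the project's documentation; verify them against your installed version, and note that the rule name is illustrative):

cd /path/to/your/project
lc-init          # one-time setup: creates the project's .llm-context configuration
lc-set-rule code # choose the active rule ("code" is an example name)
lc-sel-files     # select files according to the active rule
lc-context       # assemble the context and copy it to the clipboard, ready to paste into the chat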
For MCP integration with Claude Desktop, an entry in claude_desktop_config.json is needed.
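As a sketch only, such an entry might look like:

{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}

The server key "llm-context" and the lc-mcp entry point are assumptions based on common MCP setups; check the upstream README for the exact configuration.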
Highlighted Details
Maintenance & Community
The project is actively developed, with a recent breaking change in v0.3.0 introducing a new Markdown-based rules system. The primary contributor is @restlessronin.
Licensing & Compatibility
Licensed under the Apache License, Version 2.0. This license is permissive and generally compatible with commercial use and closed-source linking.
Limitations & Caveats
The project is under active development, and updates may overwrite configuration files. Definition implementation extraction does not support C/C++.