johnhuang316/code-index-mcp: Intelligent code indexing and analysis for LLMs
This project provides a Model Context Protocol (MCP) server designed to help Large Language Models (LLMs) index, search, and analyze code repositories efficiently. It targets developers and AI engineers seeking to integrate LLMs with their codebases for tasks like code review, refactoring, and documentation generation, offering advanced search and analysis capabilities with minimal setup.
How It Works
The server leverages a file watcher for real-time monitoring and auto-refreshing of code indexes. It intelligently selects and utilizes optimized search tools like ugrep, ripgrep, ag, or grep based on availability, supporting regex and fuzzy matching. The system performs deep file analysis to extract structural information, complexity metrics, and language-specific details, storing this data in a persistent cache for fast retrieval.
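As a rough illustration only (not the project's actual code), backend selection along these lines can be done by probing the PATH; the tool names and preference order below are assumptions based on the description above.

```python
import shutil

# Hypothetical preference order, best-optimized tool first.
SEARCH_TOOLS = ["ugrep", "rg", "ag", "grep"]

def pick_search_tool() -> str:
    """Return the first search backend found on PATH."""
    for tool in SEARCH_TOOLS:
        if shutil.which(tool):
            return tool
    raise RuntimeError("no supported search tool found on PATH")

if __name__ == "__main__":
    print(f"Using search backend: {pick_search_tool()}")
```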
Quick Start & Requirements
Add the server to your MCP client configuration (claude_desktop_config.json or ~/.claude.json) with "command": "uvx", "args": ["code-index-mcp"], then restart the MCP-compatible application. For local development, install the uv package manager, run uv sync, and configure MCP to use uv run code-index-mcp.
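For reference, a minimal configuration entry might look like the sketch below; the server label "code-index" is an arbitrary name, and the surrounding "mcpServers" structure follows the usual Claude Desktop convention and may differ in other MCP clients.

```json
{
  "mcpServers": {
    "code-index": {
      "command": "uvx",
      "args": ["code-index-mcp"]
    }
  }
}
```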
Maintenance & Community
The project is maintained by johnhuang316. Contributions are welcome via pull requests.
Licensing & Compatibility
Licensed under the MIT License, permitting commercial use and integration with closed-source applications.
Limitations & Caveats
The "auto-refresh not working" troubleshooting section suggests potential environment isolation issues with the watchdog library, recommending manual refresh or checking file watcher status as workarounds.