MCP server for exposing LLMs-TXT to IDEs
This project provides an open-source Model Context Protocol (MCP) server that exposes llms.txt documentation files to IDEs and AI applications, enabling auditable context retrieval for developers. It addresses the opacity of existing tools by offering fine-grained control over data sources and tool calls.
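For reference, an llms.txt file is a simple markdown index of documentation links. A minimal illustrative example, following the llms.txt convention (names and URLs below are placeholders, not real entries):

```markdown
# Example Project

> One-line summary of the project (illustrative placeholder).

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): Getting-started guide
- [API Reference](https://example.com/docs/api.md): Full API documentation
```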
How It Works
The server acts as a bridge, allowing applications like Cursor, Windsurf, and Claude to query specified llms.txt files. It implements a fetch_docs tool that retrieves content from URLs listed within these files. A key design choice is strict domain access control: only explicitly allowed domains can be queried, enhancing security and preventing unauthorized data access.
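The domain allow-list check described above can be sketched in a few lines. This is a hypothetical illustration of the idea, not mcpdoc's actual implementation; the function name and signature are invented here:

```python
from urllib.parse import urlparse

def is_allowed(url: str, allowed_domains: set[str]) -> bool:
    """Return True only if the URL's host is on the explicit allow-list.

    Hypothetical sketch of the strict domain access control idea;
    the real fetch_docs tool may implement this differently.
    """
    host = urlparse(url).hostname or ""
    return host in allowed_domains

allowed = {"langchain-ai.github.io"}
print(is_allowed("https://langchain-ai.github.io/langgraph/llms.txt", allowed))  # True
print(is_allowed("https://evil.example.com/llms.txt", allowed))  # False
```

Checking the parsed hostname against an explicit set, rather than substring-matching the URL, avoids bypasses like `https://evil.com/langchain-ai.github.io`.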
Quick Start & Requirements
Install uv:

  curl -LsSf https://astral.sh/uv/install.sh | sh

Run the server against one or more llms.txt files (here, the LangGraph docs on port 8082):

  uvx --from mcpdoc mcpdoc --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" --port 8082

Optionally, test the running server with the MCP Inspector:

  npx @modelcontextprotocol/inspector
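Under the hood, serving an llms.txt file amounts to reading the index and extracting its linked URLs so a tool like fetch_docs can retrieve them. A rough sketch of that parsing step (hypothetical helper, not the actual mcpdoc code):

```python
import re

def extract_doc_urls(llms_txt: str) -> list[str]:
    """Pull documentation URLs out of an llms.txt body.

    llms.txt entries are markdown links of the form:
      - [Name](https://...): description
    """
    return re.findall(r"\[[^\]]*\]\((https?://[^)\s]+)\)", llms_txt)

sample = """# Example
- [Guide](https://example.com/guide.md): a guide
- [API](https://example.com/api.md): the API
"""
print(extract_doc_urls(sample))  # ['https://example.com/guide.md', 'https://example.com/api.md']
```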
Maintenance & Community
This project is part of the LangChain AI ecosystem. Further community and roadmap details are available via the LangChain community channels.
Licensing & Compatibility
The project is licensed under the MIT License, permitting commercial use and integration with closed-source applications.
Limitations & Caveats
The README notes that, as of March 21, 2025, Claude Desktop and Claude Code may not support global rules, so rule text must be appended to prompts manually. Python version compatibility issues can also arise with certain integrations, in which case the Python executable path must be specified explicitly.
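For the Python-version caveat, uv's tooling lets you pin the interpreter explicitly. An illustrative invocation (adjust the version and URLs to your setup; the flag is uv's, not mcpdoc's):

```shell
# Pin a specific interpreter when uv's default resolution is incompatible.
uvx --python 3.11 --from mcpdoc mcpdoc \
  --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" \
  --port 8082
```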