MCP server for up-to-date code documentation in AI code editors
Top 1.8% on sourcepulse
Context7 MCP Server delivers up-to-date, version-specific code documentation and examples directly into AI coding assistants, addressing the common LLM issues of outdated information, non-existent APIs, and generic answers. It targets developers using AI code editors such as Cursor, Windsurf, and Claude Desktop, enabling them to receive accurate, context-aware code suggestions without manual research.
How It Works
Context7 MCP acts as a Model Context Protocol (MCP) server, fetching current documentation and code snippets from source repositories. When a user prompts their AI assistant to "use context7," the MCP server is invoked. It resolves library identifiers, retrieves relevant documentation for specified topics and versions, and injects this information into the LLM's context window, ensuring responses are based on the latest available information.
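Concretely, this exchange happens over MCP's JSON-RPC transport: the assistant calls a documentation-lookup tool on the server and the returned text is appended to the prompt. The sketch below shows what such a two-step exchange could look like; the tool names (resolve-library-id, get-library-docs), the argument fields, and the example library ("next.js") are illustrative assumptions rather than the server's confirmed schema.

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "resolve-library-id",
            "arguments": {"libraryName": "next.js"}}}

{"jsonrpc": "2.0", "id": 2, "method": "tools/call",
 "params": {"name": "get-library-docs",
            "arguments": {"context7CompatibleLibraryID": "/vercel/next.js",
                          "topic": "routing",
                          "tokens": 5000}}}
```

The tool results come back as plain documentation text, which the editor injects into the model's context before the answer is generated.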
Quick Start & Requirements
Run the server with npx -y @upstash/context7-mcp@latest (or via bunx, deno, Docker, or editor-specific configurations).
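For editor integrations, the server is typically registered in the editor's MCP configuration. A minimal sketch, assuming a Cursor- or Claude Desktop-style mcpServers JSON block (the exact file name, location, and key names vary by editor):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}
```

With an entry like this in place, prompts that mention "use context7" let the assistant query the server for current documentation instead of relying on training data.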
Highlighted Details
Supported runtimes and launch options: npx, bunx, deno, and Docker.
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The README includes a disclaimer stating that Context7 projects are community-contributed, and while efforts are made to maintain quality, accuracy and completeness of documentation cannot be guaranteed. Users acknowledge they use it at their own discretion and risk.