Sentry middleware for LLM interaction
Top 79.6% on SourcePulse
This project provides a Model Context Protocol (MCP) server for interacting with the Sentry API, designed to facilitate AI-driven analysis and automation. It targets developers and researchers who need to integrate Sentry data with large language models (LLMs) for tasks like automated code review and issue analysis. The primary benefit is enabling natural language queries and commands against Sentry data.
How It Works
The MCP server acts as a bridge between LLMs and the Sentry API. It leverages Cloudflare's MCP work, allowing for both remote and stdio transports. AI-powered search tools translate natural language queries into Sentry's query syntax, enhancing data exploration. This approach simplifies complex API interactions and enables advanced AI capabilities for Sentry users.
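The translation step described above can be pictured as a function that turns a plain-English request into a Sentry search string before any API call is made. The sketch below is illustrative only: in the real project an LLM performs this mapping, and the function name and lookup table here are invented stand-ins.

```python
# Hypothetical sketch: map a natural-language query to Sentry search syntax.
# In the actual server an LLM does this; a lookup table stands in here.
def translate_query(natural_language: str) -> str:
    """Return a Sentry issue-search string for a plain-English request."""
    patterns = {
        "unresolved errors": "is:unresolved level:error",
        "issues assigned to me": "is:unresolved assigned:me",
        "recent crashes": "is:unresolved error.handled:false",
    }
    key = natural_language.strip().lower()
    # Fall back to a message search when no known pattern matches.
    return patterns.get(key, f'message:"{natural_language}"')

print(translate_query("unresolved errors"))  # is:unresolved level:error
```

The fallback branch mirrors how a free-text request can still be turned into a valid `message:` search rather than failing outright.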
Quick Start & Requirements
npx @sentry/mcp-server@latest --access-token=sentry-user-token --host=sentry.example.com

The user token must carry the org:read, project:read, project:write, team:read, team:write, and event:write scopes. An OpenAI API key is required for the AI-powered search tools (search_events, search_issues).
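Before pointing the server at an instance, it can help to confirm the user token actually works. The snippet below is a hypothetical smoke test, not part of the project: the host is a placeholder, and it only exercises the org:read scope by listing organizations via Sentry's public REST API.

```python
# Hypothetical smoke test for a Sentry user token (host is a placeholder).
import urllib.error
import urllib.request

def auth_header(token: str) -> dict:
    """Build the Bearer authorization header Sentry's API expects."""
    return {"Authorization": f"Bearer {token}"}

def token_is_valid(host: str, token: str) -> bool:
    """Return True if the token can list organizations (needs org:read)."""
    req = urllib.request.Request(
        f"https://{host}/api/0/organizations/",
        headers=auth_header(token),
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

A failed check here usually means the token is missing one of the scopes listed above or was issued for a different host.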
Maintenance & Community
The project is part of the Sentry ecosystem. Further community and contribution details are available via the deployed service link.
Licensing & Compatibility
The README does not explicitly state the license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
AI-powered search tools are dependent on an OpenAI API key. The stdio transport is noted as a work in progress. Automated code review tools are provided as suggestions, not mandatory checks.