Open-source AI copilot for observability data, acting as an on-call engineer
Top 80.3% on sourcepulse
Vespper is an open-source AI copilot designed for on-call developers, aiming to streamline incident response by providing real-time, contextual insights and root cause analysis (RCA). It integrates directly into Slack and connects with various observability and incident management tools, offering a self-hostable solution for individual use.
How It Works
Vespper leverages Generative AI to automatically analyze production incidents and alerts. It integrates with tools like Datadog, Coralogix, Opsgenie, and PagerDuty, alongside knowledge sources such as GitHub, Notion, and Confluence, to gather relevant context. The system uses LiteLLM Proxy for unified LLM interaction and ChromaDB for its vector database, enabling conversational querying and automated RCA.
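Vespper's actual pipeline isn't shown here, but the retrieval step described above (indexing knowledge-source snippets and pulling the most relevant ones for an alert) can be sketched with a toy in-memory index. The bigram "embedding", the document set, and the alert text below are all illustrative stand-ins, not Vespper's code:

```python
import math

def embed(text: str) -> dict[str, int]:
    """Character-bigram 'embedding' — a toy stand-in for a real embedding model."""
    t = text.lower()
    return {t[i:i + 2]: 1 for i in range(len(t) - 1)}

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(v * b.get(k, 0) for k, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical snippets gathered from sources like GitHub, Notion, or Confluence.
docs = [
    "runbook: payment-service 500 errors usually follow a Redis failover",
    "postmortem: checkout latency spike caused by connection pool exhaustion",
    "deploy log: payment-service v2.3.1 rolled out Tuesday",
]

def top_context(alert: str, k: int = 2) -> list[str]:
    """Return the k snippets most similar to the alert text."""
    qa = embed(alert)
    return sorted(docs, key=lambda d: cosine(qa, embed(d)), reverse=True)[:k]

for snippet in top_context("alert: payment-service returning 500s"):
    print(snippet)
```

A production system like the one described would swap the bigram function for a real embedding model and the list for a vector store such as ChromaDB, then feed the retrieved snippets to the LLM as context for root cause analysis.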
Quick Start & Requirements
Run `docker compose up -d` from the repository root.
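As a sketch, the whole bootstrap might look like the following; the repository URL and the Slack variable names are assumptions, so check the project's README for the exact values:

```shell
# Sketch of a local bootstrap; repo URL and Slack variable names are assumptions.
git clone https://github.com/vespperhq/vespper.git
cd vespper

# LLM credentials live in the LiteLLM config, Slack secrets in the root .env
echo "OPENAI_API_KEY=sk-..." >> config/litellm/.env
echo "SLACK_BOT_TOKEN=xoxb-..." >> .env        # assumed variable name
echo "SLACK_SIGNING_SECRET=..." >> .env        # assumed variable name

docker compose up -d
```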
Alternatively, pull the prebuilt Docker images and use the provided scripts. Set `OPENAI_API_KEY` in `config/litellm/.env` and the Slack tokens/secrets in the root `.env` file.
Highlighted Details
Includes `vector-admin` for inspecting and managing the ChromaDB vector database.
Maintenance & Community
Licensing & Compatibility
Commercial features under `vespper-ee` are licensed separately from the open-source core.
Limitations & Caveats
The project is primarily suited for individual use; advanced features are planned for the commercial `vespper-ee` offering. Telemetry is enabled by default but can be disabled via the `.env` file.