Open-source LLMOps platform for streamlining AI workflows
Top 16.3% on sourcepulse
Pezzo is an open-source, cloud-native LLMOps platform designed for developers to manage prompts, enhance observability, and streamline AI operations. It aims to reduce costs and latency while facilitating collaboration and rapid deployment of AI changes.
How It Works
Pezzo provides a unified platform for prompt management, versioning, and delivery. It uses a cloud-native architecture built on PostgreSQL, ClickHouse, Redis, and SuperTokens for data storage and management. This architecture supports real-time monitoring and troubleshooting of LLM requests and enables rapid deployment of prompt changes.
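As a rough sketch, this is what consuming a managed prompt might look like from application code. The @pezzo/client package name, the Pezzo and PezzoOpenAI classes, getPrompt, and the prompt and variable names below are assumptions based on the Node.js client the README mentions, not a verified rendering of the current API; consult the project docs before using.

```ts
// Sketch only: assumes @pezzo/client exposes Pezzo and PezzoOpenAI wrappers;
// names and signatures may differ from the real client.
import { Pezzo, PezzoOpenAI } from "@pezzo/client";

// Credentials and project/environment identifiers are placeholders.
const pezzo = new Pezzo({
  apiKey: process.env.PEZZO_API_KEY!,
  projectId: process.env.PEZZO_PROJECT_ID!,
  environment: "Production",
});

// Proxying OpenAI calls through Pezzo so requests are captured for observability.
const openai = new PezzoOpenAI(pezzo);

async function main() {
  // Fetch the published version of a prompt by name (hypothetical prompt name).
  const prompt = await pezzo.getPrompt("SummarizeArticle");

  // Execute the prompt, interpolating variables defined in the Pezzo console.
  const response = await openai.chat.completions.create(prompt, {
    variables: { article: "..." },
  });

  console.log(response.choices[0].message?.content);
}

main();
```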
Quick Start & Requirements
- Install dependencies: npm install
- Start the local infrastructure: docker-compose -f docker-compose.infra.yaml up
- Deploy migrations: npx dotenv-cli -e apps/server/.env -- npx prisma migrate deploy
- Run the server with npx nx serve server and the console with npx nx serve console (the full sequence is collected below)
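For convenience, the same setup steps as a single shell sequence, run from the repository root. The infrastructure command runs in the foreground, so use separate terminals for the remaining steps.

```sh
# Install workspace dependencies
npm install

# Start local infrastructure (PostgreSQL, ClickHouse, Redis, SuperTokens)
docker-compose -f docker-compose.infra.yaml up

# Apply database migrations using the server's environment file
npx dotenv-cli -e apps/server/.env -- npx prisma migrate deploy

# Run the API server and the web console (each in its own terminal)
npx nx serve server
npx nx serve console
```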
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The README indicates official support for Node.js and Python clients; other client integrations are community-driven and requested via issues. Local setup involves multiple steps and infrastructure dependencies managed through Docker Compose.