AI gateway for LLM production use cases
BricksLLM provides an enterprise-grade API gateway for managing Large Language Models (LLMs) in production. It offers fine-grained access control, cost limits, and rate limits per API key, targeting developers and organizations that need to control and monitor LLM usage. The gateway supports major LLM providers such as OpenAI, Azure OpenAI, Anthropic, and vLLM, along with custom deployments.
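In practice this means clients swap their provider base URL for the gateway's. A minimal sketch of that drop-in usage follows; the host, port, path, and key variable are illustrative assumptions, not details taken from the README:

```bash
# Hypothetical example: route an OpenAI-style chat completion through the
# BricksLLM proxy instead of calling api.openai.com directly. The host/port,
# path, and $BRICKSLLM_API_KEY are placeholder assumptions.
curl -X POST http://localhost:8002/api/providers/openai/v1/chat/completions \
  -H "Authorization: Bearer $BRICKSLLM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello from the gateway"}]
      }'
```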
How It Works
BricksLLM acts as a central proxy, routing requests to configured LLM providers. It intercepts requests to enforce policies such as rate limits, cost controls, and access restrictions based on API keys and associated tags. The system leverages PostgreSQL for storing configuration and Redis for caching and session management, ensuring efficient policy enforcement and analytics.
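As a hedged illustration of that per-key policy model, creating a key with limits might look something like the admin call below; the endpoint, port, and JSON field names are assumptions based on the description above, not confirmed API details:

```bash
# Hypothetical admin request: create an API key tagged "team-a" with a
# per-minute rate limit and a spend cap. All names here are illustrative.
curl -X PUT http://localhost:8001/api/key-management/keys \
  -H "Content-Type: application/json" \
  -d '{
        "name": "team-a-key",
        "key": "replace-with-a-secret",
        "tags": ["team-a"],
        "rateLimitOverTime": 60,
        "rateLimitUnit": "m",
        "costLimitInUsd": 25.0
      }'
```

Under this model, requests made with that key would be checked against the stored limits (PostgreSQL) and running counters (Redis) before being forwarded to the provider.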
Quick Start & Requirements
Getting started is a matter of running docker compose up within the BricksLLM-Docker repository, as sketched below.
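Assuming the repository lives at the usual GitHub location under bricks-cloud (the README excerpt does not spell out the URL), the startup sequence would be:

```bash
# Clone the compose repository and start the stack in the background.
# Repository URL is assumed from the project naming.
git clone https://github.com/bricks-cloud/BricksLLM-Docker
cd BricksLLM-Docker
docker compose up -d
```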
Once the gateway is running, requests can be issued via curl commands or through SDK integration.
Highlighted Details
Maintenance & Community
The project is actively maintained by bricks-cloud. Community channels are not explicitly mentioned in the README.
Licensing & Compatibility
The README does not specify a license. Compatibility for commercial use or closed-source linking is not detailed.
Limitations & Caveats
The project is presented as production-ready with a managed version available, but the open-source version's specific stability and support guarantees are not detailed. The README does not mention specific versioning beyond a latest tag for Docker images.