Cloud-native API and AI gateway for microservice orchestration
Top 0.7% on sourcepulse
Kong API Gateway is a high-performance, cloud-native API gateway designed for managing microservices and API traffic, with added AI capabilities for multi-LLM support. It targets developers and organizations seeking a centralized, scalable solution for API orchestration, offering features like advanced routing, authentication, and extensibility through plugins.
How It Works
Kong acts as a reverse proxy, handling incoming API requests and routing them to appropriate backend services. Its architecture is plugin-based, allowing for dynamic extension of functionality. It supports declarative configuration and offers both database-backed and database-less modes for deployment flexibility. The AI Gateway features enable integration with multiple LLMs, prompt engineering, and AI observability.
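The declarative, database-less mode mentioned above means the whole gateway can be described in a single YAML file. A minimal sketch (the service, route, and upstream URL below are illustrative, not from the README) that routes `/api` to a backend and attaches the rate-limiting plugin:

```yaml
_format_version: "3.0"

services:
  - name: example-service          # illustrative name
    url: http://backend.internal:8080
    routes:
      - name: example-route
        paths:
          - /api
    plugins:
      - name: rate-limiting        # one of Kong's bundled plugins
        config:
          minute: 60
          policy: local
```

Loaded via `KONG_DATABASE=off` and `KONG_DECLARATIVE_CONFIG`, this file is the entire gateway state, which is what makes the database-less mode convenient for GitOps-style deployments.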
Quick Start & Requirements
Requires Docker and Docker Compose. Using `docker-compose`:

```shell
git clone https://github.com/Kong/docker-kong
cd docker-kong/compose/
KONG_DATABASE=postgres docker-compose --profile database up
```
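Once the stack is up, the gateway can be checked from the host. The ports below are Kong's defaults (8001 for the Admin API, 8000 for the proxy) and assume the compose file does not remap them:

```shell
# Admin API: returns node metadata as JSON if Kong is running
curl -i http://localhost:8001/

# Proxy port: Kong answers with a "no Route matched" response
# until a service and route have been configured
curl -i http://localhost:8000/
```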
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The README focuses on getting started with Docker Compose; other deployment methods may have different prerequisites. While extensibility is a core feature, developing custom plugins requires familiarity with Lua, Go, or JavaScript.
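For a sense of what plugin development involves, here is a minimal sketch of a custom Lua plugin handler using Kong's Plugin Development Kit (the plugin and header names are illustrative; a real plugin also needs a companion `schema.lua`):

```lua
-- handler.lua: a Kong plugin is a table with a priority,
-- a version, and one function per request phase it hooks into.
local MyPlugin = {
  PRIORITY = 1000,   -- execution order relative to other plugins
  VERSION = "0.1.0",
}

-- The access phase runs before the request is proxied upstream.
function MyPlugin:access(conf)
  -- kong.* is the Plugin Development Kit exposed to plugins
  kong.service.request.set_header("X-Example-Header", "hello")
end

return MyPlugin
```

Go and JavaScript plugins follow the same phase-based model but run out-of-process via Kong's plugin server.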