Full-stack boilerplate for serverless AI apps on AWS
This repository provides a full-stack, serverless boilerplate for building AI applications on AWS, targeting developers who want a robust, cost-effective foundation for LLM-powered applications. It offers a complete architecture including backend APIs, event-driven services, a database, and a frontend, all designed to leverage AWS Bedrock for AI capabilities while ensuring data privacy.
How It Works
The stack uses AWS services such as Lambda, API Gateway, DynamoDB, and EventBridge for a scalable, pay-as-you-go architecture. An Express.js backend handles the APIs and business logic, alongside a separate AI chat service powered by AWS Bedrock that supports multiple LLMs. The AI chat service streams responses via Lambda Function URLs, while the other APIs go through API Gateway. Serverless Compose manages configuration and deployment across the services.
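A multi-service layout like this is typically coordinated by a serverless-compose.yml at the repository root. The following is a minimal sketch only; the service names, paths, and output variables are assumptions, not the repo's actual layout.

```yaml
# Illustrative Serverless Compose file coordinating several services.
# Service names and paths are assumptions for this sketch.
services:
  api:
    path: ./services/api        # Express.js backend behind API Gateway
  ai-chat:
    path: ./services/ai-chat    # Bedrock-powered chat, streamed via a Function URL
  website:
    path: ./services/website    # Frontend
    params:
      apiUrl: ${api.HttpApiUrl} # Wire one service's output into another
```

Compose deploys the services in dependency order, so `serverless deploy` at the root handles the whole stack.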
Quick Start & Requirements
Requires Node.js and the Serverless Framework (npm i -g serverless). Run npm install in the repository root, then deploy with serverless deploy. AWS Bedrock models (e.g., meta.llama3-70b-instruct-v1:0) must be enabled in the AWS console.
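Bedrock's Llama models are invoked with a JSON body carrying the prompt and generation parameters. The sketch below builds such a request payload; the helper name, defaults, and chat template are illustrative assumptions, though the body fields (prompt, max_gen_len, temperature, top_p) follow Bedrock's documented Llama request format.

```javascript
// Sketch: build an invocation payload for a Llama 3 model on AWS Bedrock.
// buildLlamaRequest and its defaults are hypothetical helpers for illustration.
function buildLlamaRequest(userMessage, opts = {}) {
  // Llama 3 instruct models expect this chat template around the message.
  const prompt =
    '<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n' +
    `${userMessage}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n`;
  return {
    modelId: opts.modelId || 'meta.llama3-70b-instruct-v1:0',
    contentType: 'application/json',
    accept: 'application/json',
    body: JSON.stringify({
      prompt,
      max_gen_len: opts.maxGenLen ?? 512,
      temperature: opts.temperature ?? 0.5,
      top_p: opts.topP ?? 0.9,
    }),
  };
}

const req = buildLlamaRequest('Hello!');
```

A payload like this would be passed to the Bedrock runtime client's streaming invoke call, with the streamed chunks forwarded to the browser through the Lambda Function URL.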
Maintenance & Community
The project is maintained by serverless. The README does not explicitly mention community channels like Discord or Slack, nor does it list specific contributors or sponsorships.
Licensing & Compatibility
The README does not specify a license. Without an explicit license, commercial use and closed-source integration cannot be assumed to be permitted.
Limitations & Caveats
The project is presented as a boilerplate and may require significant configuration for production, including custom domain names and secure secret management. Hosting the static website via Lambda is noted as not recommended for production. The README also mentions potential issues with enabling AWS Bedrock models for new or low-spend AWS accounts.