Serverless app for LLM-powered PDF chat using Amazon Bedrock
This project provides a serverless application for querying PDF documents using natural language, powered by LLMs. It's designed for developers and researchers looking to build AI-powered document analysis tools on AWS. The solution leverages Amazon Bedrock for LLM capabilities, LangChain for orchestration, and other AWS serverless services for a scalable and cost-effective implementation.
How It Works
The application utilizes a serverless architecture where user-uploaded PDFs are processed by AWS Lambda functions. These functions extract text, generate embeddings using Amazon Bedrock's Titan Embeddings, and store these vectors in a FAISS index within S3. When a user queries the document, another Lambda function retrieves relevant document chunks from the FAISS index and passes them, along with conversation history from DynamoDB, to an LLM (like Claude v3 Sonnet) via Amazon Bedrock for response generation.
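The ingest-and-retrieve flow above can be sketched in plain Python. This is a minimal, self-contained illustration only: the `embed` function below is a deterministic hash-based stand-in for Amazon Titan Embeddings, the brute-force cosine search stands in for the FAISS index in S3, and the naive splitter stands in for LangChain's text splitters.

```python
import hashlib
import math

def embed(text: str, dim: int = 8) -> list[float]:
    # Stub embedding (hypothetical): the real app calls Bedrock's
    # Titan Embeddings model; here we hash tokens into a small vector.
    vec = [0.0] * dim
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already unit-normalized, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

def chunk(text: str, size: int = 40) -> list[str]:
    # Naive fixed-size splitter; the real app uses LangChain text splitters.
    return [text[i:i + size] for i in range(0, len(text), size)]

def build_index(document: str) -> list[tuple[str, list[float]]]:
    # In the real app these vectors go into a FAISS index stored in S3.
    return [(c, embed(c)) for c in chunk(document)]

def retrieve(index: list[tuple[str, list[float]]], query: str, k: int = 2) -> list[str]:
    # Rank chunks by similarity to the query embedding and return the top k,
    # which would then be passed to the LLM along with conversation history.
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [c for c, _ in ranked[:k]]
```

The same shape applies at scale: one Lambda builds the index at upload time, another runs `retrieve` per query before prompting the model.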
Quick Start & Requirements
Build and deploy with the AWS SAM CLI (`sam build`, then `sam deploy --guided`). You will need the AWS SAM CLI installed and an AWS account with access to the required Amazon Bedrock models.
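Once deployed, the query Lambda combines retrieved chunks and stored chat history into a request for Claude 3 Sonnet. The helper below is a hypothetical sketch of that step: the function name and prompt wording are illustrative, not the repository's actual code, though the payload follows the Anthropic Messages format that Bedrock expects for Claude 3.

```python
# Hypothetical helper: assemble the Anthropic Messages payload a query
# Lambda might send to Claude 3 Sonnet through the Bedrock Runtime API.
def build_claude_request(
    question: str,
    chunks: list[str],
    history: list[dict],
    max_tokens: int = 512,
) -> dict:
    context = "\n\n".join(chunks)
    # Prior turns loaded from DynamoDB, e.g. {"role": "user", "content": "..."}.
    messages = list(history)
    messages.append({
        "role": "user",
        "content": (
            "Use the following document excerpts to answer.\n\n"
            f"{context}\n\nQuestion: {question}"
        ),
    })
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": messages,
    }

# The Lambda would then invoke the model, e.g.:
#   import json, boto3
#   client = boto3.client("bedrock-runtime")
#   resp = client.invoke_model(
#       modelId="anthropic.claude-3-sonnet-20240229-v1:0",
#       body=json.dumps(build_claude_request(q, chunks, history)),
#   )
```

Keeping history as a plain list of role/content turns lets the same payload builder serve multi-turn conversations without special-casing the first message.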
Maintenance & Community
This is an `aws-samples` repository, i.e., an official AWS example. Contributions are welcome via standard GitHub pull requests.
Licensing & Compatibility
Licensed under the MIT-0 License, a variant of the MIT License without the attribution requirement, permitting commercial use and modification with essentially no restrictions.
Limitations & Caveats
The application is explicitly stated as not production-ready and is intended for demonstration and educational purposes. Security configurations (e.g., API Gateway logging, IAM role scoping) require manual review and adjustment for production deployments. Sensitive data may be logged to CloudWatch by default.