sample-OpenClaw-on-AWS-with-Bedrock by aws-samples

Cloud-native AI assistant deployment with multi-model flexibility

Created 4 weeks ago

357 stars

Top 78.7% on SourcePulse

View on GitHub
Project Summary

This project provides an AWS-native deployment of OpenClaw, an open-source personal AI assistant, using Amazon Bedrock to unify LLM access and eliminate the need to manage multiple API keys. It targets users who want an enterprise-ready, secure, and cost-effective setup, offering automated deployment, flexible model selection, and cost-optimized hardware defaults.

How It Works

The solution automates the deployment of OpenClaw on AWS using CloudFormation, enabling a one-click setup. It integrates with Amazon Bedrock, allowing seamless switching between multiple LLM providers (like Anthropic Claude, Amazon Nova, DeepSeek, Llama) via a unified API and IAM roles for authentication, enhancing security and flexibility. The architecture prioritizes cost-efficiency and performance by defaulting to Graviton ARM processors, which offer superior price-performance compared to x86 instances. Security is further bolstered by using SSM Session Manager for access and VPC Endpoints for private network communication with Bedrock.
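Because Bedrock exposes every provider behind one API, switching models reduces to changing a single model ID. A minimal sketch of that idea; the model IDs below are illustrative and must be enabled in your region's Bedrock console:

```shell
# Hedged sketch: provider switching on Bedrock is just a model-ID change.
# The IDs in this map are assumptions -- verify them in your region.
bedrock_model_id() {
  case "$1" in
    claude) echo "anthropic.claude-3-5-sonnet-20240620-v1:0" ;;
    nova)   echo "amazon.nova-lite-v1:0" ;;
    llama)  echo "meta.llama3-70b-instruct-v1:0" ;;
    *)      echo "unknown" ;;
  esac
}

# The same Converse call then works for any provider (requires AWS
# credentials and the model enabled in the region):
#   aws bedrock-runtime converse \
#     --model-id "$(bedrock_model_id claude)" \
#     --messages '[{"role":"user","content":[{"text":"Hello"}]}]'
bedrock_model_id nova   # → amazon.nova-lite-v1:0
```

Authentication comes from the instance's IAM role, so no per-provider API keys appear anywhere in this flow.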

Quick Start & Requirements

  • Primary install / run command: One-click deployment via CloudFormation by clicking the "Launch Stack" buttons provided for various AWS regions and instance types.
  • Non-default prerequisites and dependencies: An AWS account with Bedrock models enabled in the desired region, and an EC2 key pair created in that region. The SSM Session Manager Plugin is required for local access.
  • Estimated setup time or resource footprint: Approximately 8 minutes for deployment.
  • Links: CloudFormation launch links are provided within the README. Official OpenClaw Documentation: https://docs.clawd.bot/
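The Launch Stack buttons have a CLI equivalent for scripted deployments. A hedged sketch, assuming the template exposes KeyPairName and InstanceType parameters (the real template's parameter names may differ) and leaving TEMPLATE_URL as a placeholder:

```shell
# Hypothetical CLI deployment of the stack. Parameter names are assumptions;
# check the repo's CloudFormation template. The command is echoed rather than
# executed so the sketch runs without AWS credentials -- drop the leading
# echo to actually create the stack.
deploy_openclaw() {
  stack_name="$1"; key_pair="$2"; instance_type="${3:-t4g.medium}"
  echo aws cloudformation create-stack \
    --stack-name "$stack_name" \
    --template-url "$TEMPLATE_URL" \
    --parameters "ParameterKey=KeyPairName,ParameterValue=$key_pair" \
                 "ParameterKey=InstanceType,ParameterValue=$instance_type" \
    --capabilities CAPABILITY_IAM
}
```

The `--capabilities CAPABILITY_IAM` flag is needed because the template creates IAM roles on your behalf.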

Highlighted Details

  • Multi-Model Flexibility: Supports 8 LLM models (e.g., Nova, Claude, DeepSeek, Llama, Kimi) with smart routing capabilities, allowing model switching without code changes or redeployment.
  • Graviton Advantage: Recommended ARM-based EC2 instances (t4g, c7g) provide 20-40% better price-performance than x86 alternatives, with options ranging from small instances for cost-effectiveness to compute-optimized types.
  • Enterprise Security & Compliance: Eliminates API key management through IAM roles, provides comprehensive audit trails via CloudTrail, and secures traffic using VPC Endpoints. Access is managed via SSM Session Manager, avoiding public ports.
  • One-Click Deployment: CloudFormation automates the creation of VPC, IAM roles, EC2 instances, and Bedrock integration, ensuring reproducible and version-controlled deployments.
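With no public ports open, access goes through SSM Session Manager. A minimal sketch; the "InstanceId" stack-output key is an assumption about the template, and the command is built rather than run so the sketch is safe without AWS credentials:

```shell
# Hedged sketch: connect via SSM Session Manager instead of SSH (requires
# the Session Manager plugin). Builds the session command for a given
# instance ID; run the printed command to open the session.
ssm_session_cmd() {
  echo "aws ssm start-session --target $1"
}

# To look up the instance ID from the stack (assumed output key):
#   aws cloudformation describe-stacks --stack-name openclaw \
#     --query "Stacks[0].Outputs[?OutputKey=='InstanceId'].OutputValue" \
#     --output text
ssm_session_cmd i-0123456789abcdef0
```

Since the session is brokered by SSM over the instance's IAM role, every connection is also recorded in CloudTrail, which is what enables the audit trail mentioned above.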

Maintenance & Community

The project highlights that "90% of this project's code was generated through conversations with Kiro AI." Specific details on maintainers, sponsorships, or dedicated community channels (like Discord/Slack) are not provided in the README. Links to official OpenClaw documentation and GitHub issues are available.

Licensing & Compatibility

The deployment template is provided "as-is." The underlying Clawdbot/OpenClaw software is licensed under its original license, which is not specified here. The solution is designed for AWS and is compatible with various AWS services.

Limitations & Caveats

macOS instances for iOS/macOS development are significantly expensive ($468-792/month) and have a 24-hour minimum allocation, making them impractical for general OpenClaw use. Microsoft Teams integration requires additional configuration beyond the scope of the quick start guide. Disabling VPC Endpoints can save ~$22/month but reduces network security.
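The ~$22/month VPC endpoint figure is consistent with a back-of-envelope estimate, assuming roughly three interface endpoints at the common ~$0.01/endpoint-hour rate (actual pricing varies by region and AZ count, and excludes data-processing charges):

```shell
# Rough check of the ~$22/month VPC endpoint estimate:
# 3 endpoints (assumed) x $0.01/hr (assumed rate) x ~730 hrs/month
awk 'BEGIN { printf "%.2f\n", 3 * 0.01 * 730 }'   # → 21.90
```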

Health Check

  • Last Commit: 1 day ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 14
  • Issues (30d): 14
  • Star History: 363 stars in the last 28 days
