AWS AI Stack is a ready-to-use, full-stack boilerplate for building serverless AI applications on AWS. This project provides a solid AWS foundation, allowing developers to integrate AI models via AWS Bedrock while ensuring data privacy and separation from model providers.
API Gateway V2
AWS Lambda (Serverless functions)
AWS EventBridge (Event-driven architecture)
AWS DynamoDB (NoSQL Database)
AWS Bedrock (AI models)
Vanilla React application
Fully serverless AI chatbot architecture
Streaming responses for real-time AI interactions on AWS Lambda (see the streaming sketch after this list)
Integrates multiple AI models via AWS Bedrock (Claude 3.5 Sonnet, Llama 3.1, Mistral Large 2, etc.)
Ensures data privacy: your app data stays in your AWS account and is not shared with external model providers
Auto-scaling without paying for idle time
Pay-per-use pricing (additional costs may apply for DynamoDB and AWS Bedrock model usage)
API Gateway services configured with the serverless-domain-manager plugin
Lambda services configured with CloudFront Distributions
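To show how streaming might fit together, here is a minimal sketch of a chat handler that pairs Lambda response streaming (the `awslambda.streamifyResponse` wrapper available on the Node.js runtime when response streaming is enabled, e.g. on a function URL behind CloudFront) with Bedrock's `ConverseStream` API. The handler name, model ID, and request shape are illustrative assumptions, not the repo's actual code:

```javascript
// handler.js — sketch of a streaming chat Lambda (Node.js runtime).
const {
  BedrockRuntimeClient,
  ConverseStreamCommand,
} = require("@aws-sdk/client-bedrock-runtime");

const client = new BedrockRuntimeClient({});
// Placeholder: any Bedrock chat model your account has access to.
const MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0";

// awslambda.streamifyResponse is provided by the Lambda Node.js runtime
// when response streaming is enabled for the function.
exports.chat = awslambda.streamifyResponse(async (event, responseStream) => {
  const { prompt } = JSON.parse(event.body || "{}");

  const response = await client.send(
    new ConverseStreamCommand({
      modelId: MODEL_ID,
      messages: [{ role: "user", content: [{ text: prompt }] }],
    })
  );

  // Forward each text delta to the client as soon as Bedrock emits it.
  for await (const chunk of response.stream) {
    const text = chunk.contentBlockDelta?.delta?.text;
    if (text) responseStream.write(text);
  }
  responseStream.end();
});
```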
📡 API & Event-Driven Architecture
Express.js API for custom business logic
Shared EventBridge bus for publishing & subscribing to events (see the sketch after this list)
Worker service for processing events asynchronously
API Gateway authorizer for secured endpoints
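To make the publish/subscribe flow concrete, here is a rough sketch: an Express route publishes an event to a shared bus with `PutEvents`, and a separate worker Lambda, subscribed to that bus through an EventBridge rule, handles it off the request path. The bus name, source, and detail-type strings are placeholders, and the wrapping of the Express app for Lambda (e.g. with serverless-http) is omitted:

```javascript
// Sketch only — bus name, source, and detail-type are placeholders.
const express = require("express");
const {
  EventBridgeClient,
  PutEventsCommand,
} = require("@aws-sdk/client-eventbridge");

const app = express();
app.use(express.json());
const eventBridge = new EventBridgeClient({});

// API service: publish an event instead of doing slow work inline.
app.post("/messages", async (req, res) => {
  await eventBridge.send(
    new PutEventsCommand({
      Entries: [
        {
          EventBusName: "app-bus",       // the shared bus
          Source: "app.api",             // identifies the publisher
          DetailType: "message.created",
          Detail: JSON.stringify({ text: req.body.text }),
        },
      ],
    })
  );
  res.status(202).json({ accepted: true });
});

// Worker service (normally a separate function): invoked by an EventBridge
// rule that matches the event, so processing happens asynchronously.
exports.worker = async (event) => {
  console.log("processing", event["detail-type"], event.detail);
};
```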
User login & registration using Lambda (Express.js) + DynamoDB
JWT authentication for session management (see the login sketch below)
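A minimal sketch of the login path, assuming a `users` table keyed by email, a `passwordHash` attribute written at registration, and a `JWT_SECRET` environment variable (all placeholder names). The issued token is what the API Gateway authorizer would verify on protected endpoints:

```javascript
const jwt = require("jsonwebtoken");
const bcrypt = require("bcryptjs");
const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const { DynamoDBDocumentClient, GetCommand } = require("@aws-sdk/lib-dynamodb");

const db = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function login(email, password) {
  // Fetch the user record stored at registration.
  const { Item: user } = await db.send(
    new GetCommand({ TableName: "users", Key: { email } })
  );
  if (!user || !(await bcrypt.compare(password, user.passwordHash))) {
    throw new Error("Invalid credentials");
  }
  // Issue a short-lived session token for subsequent requests.
  return jwt.sign({ sub: email }, process.env.JWT_SECRET, { expiresIn: "24h" });
}

module.exports = { login };
```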
Shared configuration for all services
Separate configurations for different environments (dev, prod, etc.), as sketched below
Modular components for easy customization (remove AI Chat, authentication, etc., if not needed)
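One way to express stage-separated settings, sketched as a `serverless.js` config (the Serverless Framework also accepts `serverless.yml`; the service name, memory sizes, and the use of a `STAGE` environment variable are illustrative placeholders rather than this repo's actual layout):

```javascript
// serverless.js — sketch of per-stage configuration, not the repo's real file.
const stage = process.env.STAGE || "dev";

const stages = {
  dev: { memorySize: 512, logLevel: "debug" },
  prod: { memorySize: 1024, logLevel: "info" },
};

module.exports = {
  service: "ai-chat",
  provider: {
    name: "aws",
    runtime: "nodejs20.x",
    stage,
    memorySize: stages[stage].memorySize,
    environment: { LOG_LEVEL: stages[stage].logLevel },
  },
  functions: {
    api: { handler: "handler.handler", events: [{ httpApi: "*" }] },
  },
};
```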
Deploy services to production via GitHub Actions
Deploy PR environments and auto-remove services after merge
git clone https://github.com/your-username/aws-ai-stack.git
cd aws-ai-stack
npm install
aws configure
npx serverless deploy
cd frontend
npm start
This project is open-source and available under the MIT License.