We're excited to release the AWS AI Stack, a robust, full-stack boilerplate project for building serverless AI applications on AWS. It's a great fit for anyone seeking a trusted AWS foundation for AI apps, with access to powerful LLMs via Amazon Bedrock, which keeps your app's data separate from model providers.
The stack is ready to deploy (try the live demo at awsaistack.com), featuring an AI chat interface with streaming responses on AWS Lambda for real-time conversations, an event-driven architecture for async processing, built-in authentication, a React-based frontend, stage-separated configuration with sane defaults, and CI/CD via GitHub Actions.
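To make the streaming feature concrete, here's a minimal sketch of how a Lambda response-streaming handler can forward Bedrock output to the client chunk by chunk. This is illustrative, not the stack's actual code: it assumes the AWS SDK v3 Bedrock Runtime client, Lambda's Node.js response-streaming mode (the `awslambda` global), and an example model ID.

```typescript
// Hypothetical sketch of a streaming chat handler; names and model ID
// are illustrative, not taken from the AWS AI Stack's source.
import {
  BedrockRuntimeClient,
  ConverseStreamCommand,
} from "@aws-sdk/client-bedrock-runtime";

// `awslambda` is a global injected by the Node.js Lambda runtime when
// response streaming is enabled; declare it for TypeScript.
declare const awslambda: {
  streamifyResponse: (
    handler: (
      event: any,
      responseStream: NodeJS.WritableStream
    ) => Promise<void>
  ) => unknown;
};

const bedrock = new BedrockRuntimeClient({});

export const handler = awslambda.streamifyResponse(
  async (event, responseStream) => {
    const { message } = JSON.parse(event.body ?? "{}");

    // Ask Bedrock for a streamed completion and forward each text
    // delta to the client as soon as it arrives.
    const response = await bedrock.send(
      new ConverseStreamCommand({
        modelId: "anthropic.claude-3-5-sonnet-20240620-v1:0", // example ID
        messages: [{ role: "user", content: [{ text: message }] }],
      })
    );

    for await (const chunk of response.stream ?? []) {
      const text = chunk.contentBlockDelta?.delta?.text;
      if (text) responseStream.write(text);
    }
    responseStream.end();
  }
);
```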
One of this stack's key features is flexibility in model selection. Developers can easily work with Claude 3.5 Sonnet, Llama 3.1, Mistral Large 2, and other models available through AWS Bedrock, choosing whichever model best fits their project's needs. Further, Bedrock is designed to keep customer data isolated from the model providers: companies like Anthropic, AI21 Labs, and Stability AI don't have direct access to the data used with their models on the Bedrock platform.
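Because Bedrock exposes models behind a single runtime API, switching models can be as simple as changing the `modelId`. Here's a hedged sketch using Bedrock's Converse API; the model IDs shown are examples and may differ by version and region:

```typescript
import {
  BedrockRuntimeClient,
  ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";

const bedrock = new BedrockRuntimeClient({});

// Example Bedrock model IDs; exact IDs vary by model version and region.
const MODELS = {
  claude: "anthropic.claude-3-5-sonnet-20240620-v1:0",
  llama: "meta.llama3-1-70b-instruct-v1:0",
  mistral: "mistral.mistral-large-2407-v1:0",
} as const;

// The same request shape works across providers; only modelId changes.
export async function ask(model: keyof typeof MODELS, prompt: string) {
  const response = await bedrock.send(
    new ConverseCommand({
      modelId: MODELS[model],
      messages: [{ role: "user", content: [{ text: prompt }] }],
    })
  );
  return response.output?.message?.content?.[0]?.text;
}
```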
Built on a fully serverless architecture, the AWS AI Stack uses services like AWS Lambda, API Gateway, DynamoDB, and EventBridge. This design means you pay only for the resources you use, while your application scales automatically with demand.
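As one illustration of the event-driven side of this design, a handler might publish a domain event to EventBridge rather than doing slow work inline, letting a separate Lambda pick it up asynchronously. The bus, source, and event names below are hypothetical, not the stack's actual ones:

```typescript
import {
  EventBridgeClient,
  PutEventsCommand,
} from "@aws-sdk/client-eventbridge";

const eventBridge = new EventBridgeClient({});

// Publish a domain event; downstream Lambdas subscribe via EventBridge
// rules, so this function returns immediately and nothing runs (or bills)
// until an event actually arrives.
export async function onMessageSaved(chatId: string, messageId: string) {
  await eventBridge.send(
    new PutEventsCommand({
      Entries: [
        {
          EventBusName: "ai-stack-bus", // hypothetical bus name
          Source: "app.chat",           // hypothetical event source
          DetailType: "message.saved",
          Detail: JSON.stringify({ chatId, messageId }),
        },
      ],
    })
  );
}
```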
We've designed the AWS AI Stack with a domain-oriented architecture. This modular approach lets developers customize, add, or remove logic as needed. Whether you're building a sophisticated chatbot or an AI-powered analytics tool, or exploring entirely new AI applications, the AWS AI Stack is designed to be a flexible starting point.
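To picture what "domain-oriented" means in practice, consider a layout along these lines (a hypothetical sketch, not the repo's exact structure), where each domain owns its handlers and can be added or deleted without touching the others:

```
.
├── app/                 # React frontend
├── domains/             # hypothetical: one folder per domain
│   ├── auth/            #   login, signup, token handling
│   ├── chat/            #   Bedrock-backed streaming chat
│   └── events/          #   async EventBridge consumers
└── serverless.yml       # infrastructure definition
```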
To get started with AWS AI Stack, visit the GitHub repository and check out the live demo at awsaistack.com.