A modular and comprehensive solution for deploying a multi-LLM, multi-RAG chatbot (Amazon Bedrock, Anthropic, HuggingFace, OpenAI, Meta, AI21, Cohere, Mistral) on AWS using the AWS CDK.
This repository features three demos that can be easily integrated into your AWS environment. They serve as a practical guide to using AWS services to build a large language model (LLM) generative AI application, aimed at a responsive question-and-answer bot and at localizing content generation.
An Amazon Kendra REST API example built with the AWS CDK, fronted by Amazon API Gateway with Amazon Cognito authentication and AWS X-Ray tracing.
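As a rough illustration of how such a stack could be wired together with the AWS CDK in Python, the sketch below creates a Kendra index, a query Lambda function, and an API Gateway REST API protected by a Cognito authorizer with X-Ray tracing enabled. Resource names, the index edition, and the Lambda asset path are illustrative assumptions, not the repository's actual code.

```python
from aws_cdk import (
    Stack,
    aws_apigateway as apigw,
    aws_cognito as cognito,
    aws_iam as iam,
    aws_kendra as kendra,
    aws_lambda as _lambda,
)
from constructs import Construct


class KendraApiStack(Stack):
    """Illustrative stack: a Kendra index behind an authenticated, traced REST API."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Service role assumed by the Kendra index (permissions omitted for brevity).
        index_role = iam.Role(
            self, "KendraIndexRole",
            assumed_by=iam.ServicePrincipal("kendra.amazonaws.com"),
        )

        # Kendra has no L2 construct, so the L1 CfnIndex resource is used.
        index = kendra.CfnIndex(
            self, "DocsIndex",
            name="docs-index",
            edition="DEVELOPER_EDITION",
            role_arn=index_role.role_arn,
        )

        # Lambda function that proxies search requests to Kendra;
        # ACTIVE tracing sends segments to X-Ray.
        query_fn = _lambda.Function(
            self, "QueryFn",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="app.handler",
            code=_lambda.Code.from_asset("lambda"),  # hypothetical asset path
            tracing=_lambda.Tracing.ACTIVE,
            environment={"INDEX_ID": index.attr_id},
        )

        # Cognito user pool that authenticates API callers.
        user_pool = cognito.UserPool(self, "UserPool")
        authorizer = apigw.CognitoUserPoolsAuthorizer(
            self, "Authorizer", cognito_user_pools=[user_pool]
        )

        # REST API with X-Ray tracing enabled on the deployed stage.
        api = apigw.RestApi(
            self, "KendraApi",
            deploy_options=apigw.StageOptions(tracing_enabled=True),
        )
        api.root.add_resource("query").add_method(
            "POST",
            apigw.LambdaIntegration(query_fn),
            authorizer=authorizer,
            authorization_type=apigw.AuthorizationType.COGNITO,
        )
```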
A question-answering chatbot built on Amazon Bedrock, with retrieval-augmented generation (RAG) backed by Amazon Kendra.
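A minimal sketch of that RAG loop with boto3 follows, assuming an existing Kendra index and an Anthropic Claude model enabled in Bedrock; the index ID and model ID are placeholders, not the repository's configuration. It retrieves passages from Kendra, then asks the Bedrock model to answer from those passages only.

```python
import json

import boto3

kendra = boto3.client("kendra")
bedrock = boto3.client("bedrock-runtime")


def answer(question: str, index_id: str) -> str:
    """Retrieve passages from Kendra, then ground a Bedrock answer on them."""
    # Kendra's Retrieve API returns passage-level excerpts suited for RAG.
    retrieval = kendra.retrieve(IndexId=index_id, QueryText=question)
    context = "\n\n".join(
        item["Content"]
        for item in retrieval.get("ResultItems", [])
        if item.get("Content")
    )

    # Ask the model to answer using only the retrieved context.
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
        contentType="application/json",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 512,
            "messages": [{"role": "user", "content": prompt}],
        }),
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```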
BedrockChat acts as a conversational interface, leveraging generative AI models fine-tuned on your content.
Your personal assistant at work
Use Python to call AWS services from AWS Lambda.
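As a minimal illustration of that pattern, the handler below uses boto3 inside a Lambda function to call an AWS service (Amazon S3 here); the bucket name and environment variable are placeholders.

```python
import json
import os

import boto3

# Clients created outside the handler are reused across warm invocations.
s3 = boto3.client("s3")


def handler(event, context):
    """List a few object keys from a bucket named by an environment variable."""
    bucket = os.environ.get("BUCKET_NAME", "example-bucket")  # placeholder bucket name
    response = s3.list_objects_v2(Bucket=bucket, MaxKeys=10)
    keys = [obj["Key"] for obj in response.get("Contents", [])]
    return {"statusCode": 200, "body": json.dumps({"keys": keys})}
```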