💻 Welcome to the "Serverless LLM apps with Amazon Bedrock" course! Instructed by Mike Chambers, Developer Advocate for Generative AI at AWS, this course will teach you how to deploy Large Language Model (LLM)-based applications into production using serverless technology with Amazon Bedrock.
Course Website: 📚deeplearning.ai
In this course, you'll learn the ins and outs of deploying LLM-based applications using serverless technology. Here's what you can expect to learn and experience:
- 🛠 Prompting and Customizing LLM Responses: Prompt foundation models and customize their responses using Amazon Bedrock (see the first sketch after this list).
- 🔊 Summarizing Audio Conversations: Summarize audio conversations by transcribing audio files and passing the transcription to an LLM (second sketch below).
- ⚙️ Deploying an Event-driven Audio Summarizer: Deploy an event-driven audio summarizer that runs automatically as new audio files are uploaded, using a serverless architecture (third sketch below).
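The first topic is calling a Bedrock-hosted model from code. Below is a minimal sketch using boto3 and the Bedrock runtime `invoke_model` API with an Amazon Titan Text model; the model ID, region, and generation parameters are illustrative assumptions, not values prescribed by the course.

```python
import json
import boto3

# Bedrock runtime client; the region is an assumption for illustration
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

prompt = "Summarize the key benefits of serverless architectures in two sentences."

# Request body format used by the Amazon Titan Text models (assumed model choice)
body = json.dumps({
    "inputText": prompt,
    "textGenerationConfig": {
        "maxTokenCount": 256,
        "temperature": 0.0,
        "topP": 0.9,
    },
})

response = bedrock_runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",
    body=body,
)

# The response body is a streaming object containing JSON
response_body = json.loads(response["body"].read())
print(response_body["results"][0]["outputText"])
```

Customizing responses then largely comes down to what you put in `inputText` (instructions, context, formatting requirements) and how you tune parameters such as `temperature`.

The second topic chains speech-to-text with an LLM. The sketch below assumes Amazon Transcribe is used for transcription and that the audio file already sits in an S3 bucket; the bucket name, keys, job name, and model choice are all placeholders.

```python
import json
import time
import boto3

# Bucket name, object keys, and job name are assumptions for illustration
BUCKET = "my-audio-bucket"
AUDIO_KEY = "dialog.mp3"
JOB_NAME = "audio-summarizer-demo-job"

transcribe = boto3.client("transcribe")
s3 = boto3.client("s3")
bedrock_runtime = boto3.client("bedrock-runtime")

# 1. Start an Amazon Transcribe job against the audio file in S3
transcribe.start_transcription_job(
    TranscriptionJobName=JOB_NAME,
    Media={"MediaFileUri": f"s3://{BUCKET}/{AUDIO_KEY}"},
    MediaFormat="mp3",
    LanguageCode="en-US",
    OutputBucketName=BUCKET,
    OutputKey=f"{JOB_NAME}.json",
)

# 2. Wait for the job to finish (simple polling, fine for a demo)
while True:
    status = transcribe.get_transcription_job(TranscriptionJobName=JOB_NAME)
    if status["TranscriptionJob"]["TranscriptionJobStatus"] in ("COMPLETED", "FAILED"):
        break
    time.sleep(5)

# 3. Read the transcript text from the output JSON written to S3
transcript_obj = s3.get_object(Bucket=BUCKET, Key=f"{JOB_NAME}.json")
transcript_json = json.loads(transcript_obj["Body"].read())
transcript_text = transcript_json["results"]["transcripts"][0]["transcript"]

# 4. Ask the LLM to summarize the conversation
prompt = f"Summarize the following conversation:\n\n{transcript_text}"
body = json.dumps({"inputText": prompt, "textGenerationConfig": {"maxTokenCount": 512}})
response = bedrock_runtime.invoke_model(
    modelId="amazon.titan-text-express-v1", body=body
)
print(json.loads(response["body"].read())["results"][0]["outputText"])
```

The third topic makes the same pipeline event-driven: an S3 upload triggers an AWS Lambda function instead of a script you run by hand. The handler below is a minimal sketch that kicks off transcription when a new audio file arrives; the bucket layout, file format, and job-naming scheme are assumptions, and the course's actual architecture may split transcription and summarization across separate functions.

```python
import json
import urllib.parse
import boto3

transcribe = boto3.client("transcribe")

def lambda_handler(event, context):
    """Triggered by an S3 'object created' event for a newly uploaded audio file."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

    # Derive a valid Transcribe job name from the object key (assumed naming scheme)
    job_name = key.replace("/", "-").replace(".", "-")

    # Start transcription; a downstream step (not shown) can react to the
    # transcript landing in S3 and call Bedrock to summarize it.
    transcribe.start_transcription_job(
        TranscriptionJobName=job_name,
        Media={"MediaFileUri": f"s3://{bucket}/{key}"},
        MediaFormat="mp3",
        LanguageCode="en-US",
        OutputBucketName=bucket,
        OutputKey=f"transcripts/{job_name}.json",
    )
    return {"statusCode": 200, "body": json.dumps({"started": job_name})}
```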
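Because the function only runs when an upload event fires, there are no servers to manage and no idle cost, which is the core appeal of the serverless approach taught in the course.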
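Together, these three sketches trace the path from a single prompt, to an audio-to-summary pipeline, to a fully automated serverless deployment.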
🌟 Mike Chambers is a Developer Advocate for Generative AI at AWS and co-instructor of the "Generative AI with Large Language Models" course. He will guide you through deploying serverless LLM applications with Amazon Bedrock.
🔗 To enroll in the course or for further information, visit deeplearning.ai.