In part 6 of Generative AI Foundations on AWS, AWS generative AI expert instructor Emily Webber teaches you about reinforcement learning with human feedback (RLHF) and why it is important. You'll learn how it addresses problems of subjectivity and objectivity through ranked human preference at scale before diving into how it works. Dive deep into the original GPT, labeling from humans, reward models, and how to leverage the reward model signal to update a new GPT model. You'll then learn how to do this with SageMaker Ground Truth before going hands-on with RLHF on SageMaker using this workshop: https://go.aws/3rAB2Of

Learn more about generative AI on AWS: https://go.aws/3rw39hF

Tune in to Build On Generative AI with host Emily Webber on twitch.tv/aws for even more tips and tricks: https://m.twitch.tv/videos/1723458659

Access the slides from this lesson to follow along: https://github.com/aws-samples/sagemaker-distributed-training-workshop/blob/main/slides/Generative%20AI%20Foundations%20Technical%20Deep%20Dive/1%20-%20Intro%20to%20FMs.pdf.zip

Subscribe:
More AWS videos: https://go.aws/3m5yEMW
More AWS events videos: https://go.aws/3ZHq4BK

Do you have technical AWS questions? Ask the community of experts on AWS re:Post: https://go.aws/3lPaoPb

ABOUT AWS
Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, are using AWS to lower costs, become more agile, and innovate faster.

#AWSMachineLearning #MachineLearningUniversity #MLU #LearnML #LearnMachineLearning #aiml #GenerativeAI #GPT #AWS #AmazonWebServices #CloudComputing
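The reward-model idea described above (collecting ranked human preferences and turning them into a training signal) is commonly implemented as a pairwise ranking loss: the reward model is pushed to score the human-preferred response higher than the rejected one. The sketch below is a minimal illustration of that loss, not code from the workshop; the function name and plain-Python implementation are assumptions:

```python
import math

def reward_ranking_loss(chosen_scores, rejected_scores):
    """Pairwise ranking loss for reward-model training.

    For each preference pair, compute -log(sigmoid(r_chosen - r_rejected)):
    the loss shrinks as the reward model scores the human-preferred
    ("chosen") response further above the rejected one.
    """
    losses = [
        -math.log(1.0 / (1.0 + math.exp(-(chosen - rejected))))
        for chosen, rejected in zip(chosen_scores, rejected_scores)
    ]
    return sum(losses) / len(losses)

# Example: reward-model scores for three labeled preference pairs.
# In the third pair the model currently prefers the wrong response,
# so that pair contributes the largest loss.
chosen = [2.0, 1.5, 0.3]
rejected = [0.5, 1.0, 0.9]
loss = reward_ranking_loss(chosen, rejected)
```

In a full RLHF pipeline, this trained reward model then scores the generations of the policy (GPT) model, and that scalar signal is used by an RL algorithm to update the policy's weights.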
Generative AI Foundations on AWS is a technical deep dive course that gives you the conceptual fundamentals, practical advice, and hands-on guidance to pre-train, fine-tune, and deploy state-of-the-art foundation models on AWS and beyond. Developed by AWS Generative AI Worldwide Foundations Lead Emily Webber, this free hands-on course comes with supporting GitHub source code. If you are looking for a curated playlist of the top resources, concepts, and guidance to get up to speed on foundation models, especially those that unlock generative capabilities in your data science and machine learning projects, then look no further.

During this 8-hour deep dive, you will be introduced to the key techniques, services, and trends that will help you understand foundation models from the ground up. This means breaking down theory, mathematics, and abstract concepts combined with hands-on exercises to gain functional intuition for practical application. Throughout the course, we focus on a wide spectrum of progressively complex generative AI techniques, giving you a strong base to understand, design, and apply your own models for the best performance. We'll start by recapping foundation models: understanding where they come from, how they work, how they relate to generative AI, and what you can do to customize them. You'll then learn about picking the right foundation model to suit your use case. Once you've developed a strong contextual understanding of foundation models and how to use them, you'll be introduced to the core subject of this course: pre-training new foundation models. You'll learn why you'd want to do this, as well as how and where it's competitive. You'll even learn how to use the scaling laws to pick the right model, dataset, and compute sizes. We'll cover preparing training datasets at scale on AWS, including picking the right instances and storage techniques.
We'll cover fine-tuning your foundation models, evaluating recent techniques, and understanding how to run these with your scripts and models. We'll dive into reinforcement learning with human feedback, exploring how to use it skillfully and at scale to truly maximize your foundation model performance. Finally, you'll learn how to apply theory to production by deploying your new foundation model on Amazon SageMaker, including across multiple GPUs and using top design patterns like retrieval-augmented generation and chained dialogue. As an added bonus, we'll walk you through a Stable Diffusion deep dive, prompt engineering best practices, standing up LangChain, and more.

Learn more about generative AI on AWS: https://aws.amazon.com/generative-ai/

Get even more tips, tricks, and guidance in the Build On Generative AI Twitch series with host Emily Webber: https://m.twitch.tv/videos/1723458659

Learn about Amazon Bedrock, the easiest way to build and scale generative AI applications with foundation models (FMs): https://aws.amazon.com/bedrock/