https://cnfl.io/kafka-python-module-3 | In this lecture, you will learn how to send data to Kafka topics using the Python Producer class. Follow along as Dave Klein (Senior Developer Advocate, Confluent) covers all of this in detail.

Use the promo code GOVERNINGSTREAMS101 to get $25 of free Confluent Cloud usage: https://cnfl.io/try-cloud-governing-data-streams
Promo code details: https://cnfl.io/governing-streams-101-promo-code-details

LEARN MORE
► confluent_kafka.Producer class: https://cnfl.io/confluent-kafka-producer-class-python

ABOUT CONFLUENT
Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Confluent’s cloud-native offering is the foundational platform for data in motion, designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, organizations can meet the new business imperative of delivering rich, digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven backend operations. To learn more, please visit www.confluent.io.

#python #streamprocessing #apachekafka #kafka #confluent
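To give a sense of what producing with the confluent_kafka.Producer class looks like, here is a minimal sketch. The broker address, topic name, key, and value are placeholders; a real Confluent Cloud cluster would also need security settings (security.protocol, sasl.mechanisms, sasl.username, sasl.password).

    from confluent_kafka import Producer

    # Placeholder configuration; a Confluent Cloud cluster would also need
    # security.protocol, sasl.mechanisms, sasl.username, and sasl.password.
    producer = Producer({'bootstrap.servers': 'localhost:9092'})

    def delivery_report(err, msg):
        # Called once per message to report delivery success or failure.
        if err is not None:
            print(f'Delivery failed: {err}')
        else:
            print(f'Delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}')

    # produce() is asynchronous; the message is buffered and sent in the background.
    producer.produce('hello_topic', key='1', value='hello world', callback=delivery_report)

    # poll() serves delivery callbacks; flush() blocks until all buffered messages are sent.
    producer.poll(0)
    producer.flush()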
Get a complete introduction to the confluent-kafka Python library, how it works, and how to use it to stream data to Kafka topics, with hands-on exercises.

Why the confluent-kafka Python library? To be successful, services must operate in real time, and many businesses rely on the Apache Kafka data streaming framework to achieve this. Their business applications must be able to write to and read from Kafka clusters, and when those applications are developed in Python, their developers need to know how to do this from Python.

In this course, you'll learn:
1) Why Python is a good fit for writing real-time applications
2) How to develop Kafka data streaming apps with Python through hands-on exercises
3) How to produce data to and consume data from Confluent Cloud
4) How to send data to Kafka topics using the Python Producer class
5) How to read data from Kafka topics using the Python Consumer class (see the sketch after this list)
6) How to integrate applications that use the Python Producer and Consumer classes with the Confluent Schema Registry
7) (Hands-on exercise) Define a JSON schema and produce data using a Producer, a JSONSerializer, and the Schema Registry (see the sketch after this list)
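As a rough companion to item 5, here is a minimal Consumer sketch; the broker address, consumer group id, and topic name are placeholders.

    from confluent_kafka import Consumer

    consumer = Consumer({
        'bootstrap.servers': 'localhost:9092',   # placeholder broker
        'group.id': 'demo-group',                # placeholder consumer group
        'auto.offset.reset': 'earliest',
    })
    consumer.subscribe(['hello_topic'])

    try:
        while True:
            msg = consumer.poll(1.0)   # wait up to 1 second for a message
            if msg is None:
                continue
            if msg.error():
                print(f'Consumer error: {msg.error()}')
                continue
            print(f'Consumed key={msg.key()}, value={msg.value().decode("utf-8")}')
    except KeyboardInterrupt:
        pass
    finally:
        consumer.close()   # commit final offsets and leave the group

And for item 7, a sketch of the JSON Schema flow, assuming a hypothetical Temperature schema, a Schema Registry at http://localhost:8081, and a temp_readings topic (all placeholders), with the library's Schema Registry dependencies installed.

    from confluent_kafka import Producer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.json_schema import JSONSerializer
    from confluent_kafka.serialization import SerializationContext, MessageField

    # Hypothetical JSON schema used only for illustration.
    schema_str = """
    {
      "$schema": "http://json-schema.org/draft-07/schema#",
      "title": "Temperature",
      "type": "object",
      "properties": {
        "city": {"type": "string"},
        "reading": {"type": "number"}
      },
      "required": ["city", "reading"]
    }
    """

    schema_registry_client = SchemaRegistryClient({'url': 'http://localhost:8081'})
    json_serializer = JSONSerializer(schema_str, schema_registry_client)

    producer = Producer({'bootstrap.servers': 'localhost:9092'})
    reading = {'city': 'Austin', 'reading': 28.5}

    # The serializer validates the record against the schema and registers the
    # schema with Schema Registry before the bytes are handed to produce().
    producer.produce(
        topic='temp_readings',
        key='Austin',
        value=json_serializer(reading, SerializationContext('temp_readings', MessageField.VALUE)),
    )
    producer.flush()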