Imagine building systems that react instantly to streams of data: fraud alerts triggered within milliseconds, dashboards pulsing with live updates, and digital platforms scaling to millions of users in real time. In this hands-on course on Apache Kafka, you'll learn to bring these capabilities to life. Through a dynamic mix of real-world examples, interactive labs, and practical demonstrations, you'll progress from the basics of event streaming to deploying advanced, production-ready architectures.
Foundations of Event Streaming
Begin with a solid understanding of event streaming concepts and event-driven architecture, exploring how Apache Kafka powers real-time data flows in finance and other industries. You'll examine critical use cases, deploy your own Kafka cluster and user interface using Docker, and create your first Kafka topic.
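As a taste of the lab setup, a single-node broker for local experimentation can be started with Docker along these lines (a minimal sketch, assuming the apache/kafka image; the orders topic, partition count, and in-container paths are illustrative and may differ from the course environment):

```bash
# Start a single-node Kafka broker in KRaft mode (no ZooKeeper), exposed on localhost:9092
docker run -d --name kafka -p 9092:9092 apache/kafka:3.7.0

# Create a first topic with three partitions (replication factor 1 on a single broker)
docker exec kafka /opt/kafka/bin/kafka-topics.sh \
  --bootstrap-server localhost:9092 \
  --create --topic orders --partitions 3 --replication-factor 1

# Confirm the topic layout
docker exec kafka /opt/kafka/bin/kafka-topics.sh \
  --bootstrap-server localhost:9092 --describe --topic orders
```

In the labs, a broker like this is paired with a web UI container so you can inspect topics and messages visually.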
Building Blocks of Kafka
Dive into Kafka’s core architecture, including brokers, topics, partitions, and replication, learning how these elements work together to store, organize, and reliably deliver streaming data.
Kafka Producers & Consumers: The Message Flow
Understand how producers and consumers form the backbone of Kafka's event pipeline. Learn message serialization, the significance of message keys, and reliability strategies like acknowledgments and consumer groups. Reinforce concepts by building and configuring producers and consumers, exploring advanced operations like rebalancing.
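To preview what that looks like in code, here is a minimal sketch using Kafka's Java client; the orders topic, order-processors group, and localhost:9092 address are placeholder values rather than the course's exact lab code:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderPipeline {

    public static void main(String[] args) {
        produce();
        consume();
    }

    // Producer: the message key controls partition assignment, so all events
    // for one key land on the same partition and stay in order.
    static void produce() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for all in-sync replicas

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "customer-42", "order placed"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.printf("written to partition %d at offset %d%n",
                                    metadata.partition(), metadata.offset());
                        }
                    });
        }
    }

    // Consumer: instances sharing a group.id split the topic's partitions between them.
    static void consume() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-processors");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("key=%s value=%s partition=%d offset=%d%n",
                        record.key(), record.value(), record.partition(), record.offset());
            }
        }
    }
}
```

Setting acks=all trades a little latency for durability, and the shared group.id is what lets you scale out consumers while Kafka rebalances partitions among them.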
Deep Dive into Kafka: Beyond the Basics
Advance your skills by looking at Kafka’s offset management, error handling (including poison pill scenarios), and the roles of ZooKeeper and KRaft for cluster management.
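One pattern worth previewing here is manual offset commits combined with per-record error handling, so a malformed "poison pill" event is logged and skipped rather than stalling the partition. A rough sketch with the Java client, with placeholder topic, group, and processing logic:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SafeOrderConsumer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-processors");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Turn off auto-commit so offsets advance only after records are handled.
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    try {
                        process(record.value());
                    } catch (Exception e) {
                        // A "poison pill": log it (or route it to a dead-letter topic)
                        // instead of letting one bad record block the whole partition.
                        System.err.printf("Skipping bad record at offset %d: %s%n",
                                record.offset(), e.getMessage());
                    }
                }
                consumer.commitSync(); // commit only after the batch has been dealt with
            }
        }
    }

    static void process(String value) {
        // Placeholder business logic: parse and handle the event; throws on malformed input.
        if (value == null || value.isBlank()) {
            throw new IllegalArgumentException("empty payload");
        }
    }
}
```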
Confluent Kafka and Its Offerings
Explore the operational complexities of running Kafka at scale, and discover how Confluent Cloud simplifies management and deployment for cloud-native event streaming.
Kafka Connect: Effortless Data Pipelines
Learn how Kafka Connect enables seamless streaming of data to and from external systems. Build your own pipeline to stream Kafka data to Amazon S3, gaining practical insights into connector configuration and deployment.
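For a sense of how declarative this is, an S3 sink is typically just a JSON config submitted to the Kafka Connect REST API (for example, POST http://localhost:8083/connectors). This sketch assumes Confluent's S3 sink connector plugin is installed; the connector name, topic, bucket, region, and flush size are illustrative:

```json
{
  "name": "orders-to-s3",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "s3.bucket.name": "my-kafka-archive-bucket",
    "s3.region": "us-east-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "100"
  }
}
```

Once accepted, the connector's tasks read from the topic and write objects to the bucket, with flush.size controlling how many records accumulate per S3 object.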
Building an Event-Driven System
Design and implement a complete event-driven architecture utilizing Kafka, from infrastructure setup on AWS EC2 to frontend and backend integration. Consolidate your skills with a capstone lab, bringing together all course concepts for real-world application in an interactive environment.
Target Audience:
This course is ideal for software developers, DevOps engineers, data engineers, and technical leads eager to design, deploy, and manage real-time, event-driven systems using industry-leading technologies.
By the end of this course, you’ll be fully equipped to leverage Kafka and related tools for real-time data streaming, unlocking new levels of innovation and efficiency for your projects and organization.
Raghunandana Krishnamurthy is a seasoned Staff Data Engineer and MLOps expert, skilled in navigating both GCP and AWS cloud platforms to accelerate model development and deployment. His experience spans modernizing legacy data systems, architecting hybrid infrastructures, and ensuring data quality for diverse applications. He has held the AWS Solutions Architect Associate, Cloudera Hadoop Administrator, Airflow, and Databricks Lakehouse certifications.
A technical leader and passionate trainer, Raghunandana excels at building and maintaining big data platforms, championing DevOps best practices, and fostering team alignment. He brings hands-on expertise in tools such as SageMaker, Vertex AI, Prometheus, and Grafana, along with an extensive DevOps toolchain focused on data engineering and MLOps.