Ensure Consistency in Streaming Pipelines
Master the design and implementation of consistent streaming data pipelines using Apache Kafka, Spark, and Flink. In this hands-on course, you'll apply systematic decision frameworks to select the appropriate delivery guarantee (at-most-once, at-least-once, exactly-once) based on business requirements and failure-scenario analysis. You'll implement end-to-end exactly-once processing by configuring Kafka producer transactions, Spark Structured Streaming checkpoints, and Hudi transactional tables, then validate your implementation through integration testing with failure injection. Finally, you'll …
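As a taste of the Kafka side of the course, the producer settings that enable transactional, exactly-once delivery look roughly like this (a minimal sketch using Kafka's documented producer configuration keys; the `transactional.id` value is a hypothetical example):

```properties
# Enable idempotent writes so broker retries never duplicate records
enable.idempotence=true
# A stable, unique id lets the broker fence zombie producer instances
# across restarts ("orders-pipeline-1" is a placeholder, not a required name)
transactional.id=orders-pipeline-1
# Require acknowledgement from all in-sync replicas before a write commits
acks=all
```

On the Spark side, the matching piece is a durable `checkpointLocation` on the streaming query, so offsets and state survive restarts; the course walks through wiring these together.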
Watch on Coursera ↗
DeepCamp AI