Stream & Optimize Real-Time Data Flows


Free to audit on Coursera

Coursera · Intermediate · 🏗️ Systems Design & Architecture · 1mo ago
Master the design, implementation, and optimization of production-ready streaming data pipelines using Apache Kafka and Flink. This intermediate-level course teaches you to evaluate log configurations against governance requirements (PCI-DSS, GDPR, SOC2) and cost constraints, design stream processing topologies that join and aggregate data in real time with exactly-once semantics, and optimize pipelines through partition tuning, compression, and cost modeling.

You'll work through hands-on labs that mirror real-world scenarios at DoorDash, Netflix, and Robinhood: comparing retention policies against compliance rules, building a Kafka Streams application that joins orders and payments to calculate 5-minute revenue totals, and diagnosing performance bottlenecks to meet SLAs within budget.

Who it's for: intermediate data engineers and platform engineers who build or operate real-time streaming systems and want to master Kafka/Flink governance, joins, windowing, and cost-optimized scaling.

Prerequisites: an understanding of distributed systems, basic Apache Kafka knowledge, familiarity with SQL and streaming concepts, and Python or Java programming experience.

By the end, you'll design and optimize a multi-tenant streaming platform with governance controls: skills directly applicable to streaming data engineer, real-time platform engineer, and data infrastructure roles.
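To give a flavor of the first lab theme (comparing retention policies against compliance rules), here is a minimal Python sketch of that kind of check. The topic names, retention values, and the 30-day policy ceiling are all illustrative assumptions, not configs from the course or any real cluster; `retention.ms` is the actual Kafka topic setting the check models.

```python
# Hypothetical compliance check: flag topics whose retention exceeds a
# policy ceiling (e.g. a data-minimization limit derived from GDPR).
# All topic names, values, and the ceiling below are illustrative.
POLICY_MAX_RETENTION_MS = 30 * 24 * 60 * 60 * 1000  # assumed 30-day ceiling

topics = {
    "orders":   {"retention.ms": 7 * 24 * 60 * 60 * 1000},   # 7 days
    "payments": {"retention.ms": 90 * 24 * 60 * 60 * 1000},  # 90 days
}

def violations(topic_configs, max_retention_ms):
    """Return the topics that retain data longer than policy allows."""
    return [name for name, cfg in topic_configs.items()
            if cfg["retention.ms"] > max_retention_ms]

print(violations(topics, POLICY_MAX_RETENTION_MS))  # ['payments']
```

In practice such a check would read live topic configs (e.g. via the Kafka AdminClient) rather than a hard-coded dict; the comparison logic stays the same.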
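The second lab (joining orders and payments into 5-minute revenue totals) uses Kafka Streams; as a language-neutral illustration of the underlying logic, here is a small Python sketch of an inner join on order ID followed by a tumbling-window sum. The record shapes and field names are assumptions for the example, not the course's schema.

```python
from collections import defaultdict

WINDOW_MS = 5 * 60 * 1000  # 5-minute tumbling windows

def window_start(ts_ms: int) -> int:
    """Align a timestamp to the start of its tumbling window."""
    return ts_ms - (ts_ms % WINDOW_MS)

def windowed_revenue(orders, payments):
    """Join orders to completed payments on order_id, then sum revenue
    per 5-minute window keyed on each order's event timestamp.

    orders:   iterable of (order_id, ts_ms, amount)  -- assumed shape
    payments: iterable of (order_id, status)         -- assumed shape
    """
    paid = {oid for oid, status in payments if status == "COMPLETED"}
    totals = defaultdict(float)
    for oid, ts_ms, amount in orders:
        if oid in paid:  # inner join on order_id
            totals[window_start(ts_ms)] += amount
    return dict(totals)

orders = [("o1", 0, 10.0), ("o2", 100_000, 5.0), ("o3", 310_000, 7.5)]
payments = [("o1", "COMPLETED"), ("o2", "COMPLETED"), ("o3", "COMPLETED")]
print(windowed_revenue(orders, payments))
# o1 and o2 land in the [0, 300000) window; o3 in [300000, 600000)
```

A real Kafka Streams topology expresses the same idea declaratively (a stream-stream join plus `windowedBy` with a `TimeWindows` spec) and additionally handles out-of-order data via grace periods, which this batch sketch ignores.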
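For the partition-tuning and cost-modeling theme, a widely cited Kafka sizing rule of thumb (not taken from this course) is to provision enough partitions that neither the produce side nor the consume side caps throughput. A minimal sketch, with all throughput figures assumed for illustration:

```python
import math

def partitions_needed(target_mb_s: float,
                      per_partition_produce_mb_s: float,
                      per_partition_consume_mb_s: float) -> int:
    """Rule-of-thumb partition count: max of what the producer side and
    the consumer side each require to hit the target throughput."""
    return max(
        math.ceil(target_mb_s / per_partition_produce_mb_s),
        math.ceil(target_mb_s / per_partition_consume_mb_s),
    )

# e.g. 100 MB/s target, 10 MB/s produce and 5 MB/s consume per partition
print(partitions_needed(100, 10, 5))  # 20 partitions (consumer-bound)
```

More partitions also carry costs (open file handles, leader-election time, per-partition overhead), which is why the course pairs partition tuning with cost modeling rather than simply maximizing the count.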