
You're overseeing an Apache Kafka-based IoT pipeline that processes 5000 messages per second. Your task is to create an alert that triggers when the one-hour moving average drops below 4000 messages per second. Which Google Cloud Platform strategy should you implement?
A. Link Kafka to Pub/Sub using Kafka Connect, utilize a Dataflow template to write messages to BigQuery, employ Cloud Scheduler to tally the rows added to BigQuery in the past hour, and trigger an alert if the count is under 4000.
B. Stream the data into Dataflow via Kafka IO, apply a fixed one-hour time window, calculate the average upon window closure, and issue an alert if the average falls below 4000 messages per second.
C. Stream the data into Dataflow via Kafka IO, implement a one-hour sliding window that advances every 5 minutes, compute the average at the end of each window, and alert if the average is less than 4000 messages per second.
D. Connect Kafka to Pub/Sub with Kafka Connect, use a Dataflow template to store messages in Bigtable, schedule Cloud Scheduler to count the Bigtable rows written in the last hour, and send an alert if the count is below 4000.
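
For reference, option C describes the sliding-window pattern that Apache Beam (the SDK behind Dataflow) supports directly. Below is a minimal Java sketch of that approach: it reads from Kafka via Kafka IO, applies a one-hour sliding window that advances every 5 minutes, counts messages per window, and flags any window whose average rate falls below 4000 messages per second. The broker address, topic name, and the alert hook (a log line standing in for, say, a Pub/Sub notification or a Cloud Monitoring metric) are placeholder assumptions, not details from the question.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.joda.time.Duration;

public class ThroughputAlertPipeline {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply("ReadFromKafka", KafkaIO.<String, String>read()
            .withBootstrapServers("kafka-broker:9092") // placeholder broker address
            .withTopic("iot-events")                   // placeholder topic name
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata())
        // One-hour sliding windows that advance every 5 minutes, as in option C.
        .apply("SlidingWindow", Window.<KV<String, String>>into(
            SlidingWindows.of(Duration.standardHours(1))
                .every(Duration.standardMinutes(5))))
        // Count the messages that fall into each window.
        .apply("CountPerWindow", Count.<KV<String, String>>globally().withoutDefaults())
        // Turn each window's count into an average rate and flag drops below 4000 msg/s.
        .apply("CheckThreshold", ParDo.of(new DoFn<Long, Void>() {
          @ProcessElement
          public void processElement(@Element Long count) {
            double avgPerSecond = count / 3600.0; // 3600 seconds in the one-hour window
            if (avgPerSecond < 4000.0) {
              // Placeholder alert hook: in practice this might publish to a
              // Pub/Sub alerting topic or write a custom Cloud Monitoring metric.
              System.err.println(
                  "ALERT: one-hour moving average " + avgPerSecond + " msg/s is below 4000");
            }
          }
        }));

    p.run();
  }
}
```

Because each window advances by 5 minutes, the average is re-evaluated twelve times per hour, which is what makes this a moving average rather than the once-per-hour check a fixed window (option B) would give.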