
You are operating an IoT pipeline built on Apache Kafka that averages 5,000 messages per second. You want to configure an alert on Google Cloud Platform that triggers when the 1-hour moving average drops below 4,000 messages per second. How would you set up this alert?
A
Link your Kafka message queue to Pub/Sub using Kafka Connect. Utilize a Dataflow template to transfer messages from Pub/Sub to BigQuery. Schedule a script via Cloud Scheduler to run every five minutes, counting the rows added to BigQuery in the last hour. If the count is below 4000, trigger an alert.
B
Process the data stream in Dataflow with Kafka IO. Implement a sliding time window of 1 hour, updating every 5 minutes. Calculate the average upon window closure, and issue an alert if the average falls below 4000 messages.
C
Connect your Kafka message queue to Pub/Sub with Kafka Connect. Use a Dataflow template to move messages from Pub/Sub to Bigtable. Have Cloud Scheduler execute a script hourly to tally the rows inserted into Bigtable in the past hour. If the tally is under 4000, send an alert.
D
Ingest the data stream in Dataflow via Kafka IO. Apply a fixed 1-hour time window. Determine the average at the window's end, and alert if the average is less than 4000 messages.
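The sliding-window averaging described in option B (a 1-hour window advancing every 5 minutes) can be sketched as a plain-Python simulation. This is a minimal illustration of the windowing arithmetic only, not Dataflow/Beam code; all names (`sliding_window_averages`, `alerts`, the constants) are hypothetical.

```python
from collections import deque

WINDOW_SECONDS = 3600   # 1-hour sliding window
SLIDE_SECONDS = 300     # window advances every 5 minutes
THRESHOLD = 4000        # alert when avg messages/sec drops below this

def sliding_window_averages(counts_per_slide):
    """counts_per_slide: message counts for consecutive 5-minute slides.

    Yields the average messages/sec over each complete 1-hour window.
    """
    slides_per_window = WINDOW_SECONDS // SLIDE_SECONDS  # 12 slides per hour
    window = deque(maxlen=slides_per_window)  # oldest slide drops off automatically
    for count in counts_per_slide:
        window.append(count)
        if len(window) == slides_per_window:
            # Total messages in the last hour divided by 3600 seconds.
            yield sum(window) / WINDOW_SECONDS

def alerts(counts_per_slide):
    """True for each window whose hourly average falls below the threshold."""
    return [avg < THRESHOLD for avg in sliding_window_averages(counts_per_slide)]
```

For example, twelve slides at a healthy 5,000 msg/s (1,500,000 messages per 5-minute slide) average out to exactly 5,000 and raise no alert, while twelve slides at 3,000 msg/s average 3,000 and do. A fixed 1-hour window (option D) would instead evaluate the average only once per hour, so a sustained drop could go unnoticed for up to an hour.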