
Answer-first summary for fast verification
Answer: Process the data stream in Dataflow with Kafka IO. Implement a sliding time window of 1 hour, updating every 5 minutes. Calculate the average upon window closure, and issue an alert if the average falls below 4000 messages.
**Correct Answer: B**

**Option B** is correct because it uses Dataflow with Kafka IO to consume the data stream directly. The 1-hour sliding window, advancing every 5 minutes, enables near-real-time monitoring of the moving average. Computing the average as each window closes ensures the alert reflects current data, meeting the requirement to alert when the average drops below 4000 messages per second. A minimal pipeline sketch of this approach follows below.

**Why the other options are incorrect:**

- **Option A** adds unnecessary complexity by routing messages through BigQuery and counting rows on a schedule, which does not directly monitor the Kafka message rate.
- **Option C** similarly complicates the pipeline by counting rows inserted into Bigtable, detracting from direct monitoring of the message rate.
- **Option D** uses a fixed 1-hour window, so the average is evaluated only once per hour; it lacks the granularity for timely detection of rate drops, unlike the sliding window in Option B.
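For readers who want to see what Option B looks like in practice, here is a minimal Apache Beam (Java) sketch that could run on Dataflow. The broker address `broker-1:9092`, the topic `iot-events`, and the key/value types are placeholder assumptions, not part of the question; a production pipeline would also publish the alert to Pub/Sub or a Cloud Monitoring log-based metric rather than printing it.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.transforms.Combine;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.joda.time.Duration;

public class KafkaRateAlert {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    p.apply("ReadFromKafka", KafkaIO.<Long, String>read()
            .withBootstrapServers("broker-1:9092")   // hypothetical broker address
            .withTopic("iot-events")                 // hypothetical topic name
            .withKeyDeserializer(LongDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata())
        // 1-hour sliding window that advances every 5 minutes (Option B).
        .apply("SlidingWindow", Window.into(
            SlidingWindows.of(Duration.standardHours(1))
                .every(Duration.standardMinutes(5))))
        // Count the messages in each window; withoutDefaults() is required
        // when combining an unbounded, non-globally-windowed collection.
        .apply("CountPerWindow",
            Combine.globally(Count.<KV<Long, String>>combineFn()).withoutDefaults())
        // Convert each window's count to an average messages-per-second
        // rate over the hour and check it against the 4000 msg/s threshold.
        .apply("CheckRate", ParDo.of(new DoFn<Long, Void>() {
          @ProcessElement
          public void processElement(@Element Long count) {
            double avgPerSecond = count / 3600.0;
            if (avgPerSecond < 4000.0) {
              // In production, publish to Pub/Sub or emit a log-based
              // metric that Cloud Monitoring can alert on.
              System.err.println(
                  "ALERT: moving average " + avgPerSecond + " msg/s < 4000");
            }
          }
        }));

    p.run();
  }
}
```

Because each window spans a full hour, dividing the window's count by 3600 yields the 1-hour moving average in messages per second, re-evaluated every 5 minutes as new windows close.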
Author: LeetQuiz Editorial Team
You are operating an IoT pipeline using Apache Kafka, which averages 5000 messages per second. Your goal is to configure an alert on Google Cloud Platform that triggers when the 1-hour moving average drops below 4000 messages per second. How would you set up this alert on Google Cloud Platform?
A
Link your Kafka message queue to Pub/Sub using Kafka Connect. Utilize a Dataflow template to transfer messages from Pub/Sub to BigQuery. Schedule a script via Cloud Scheduler to run every five minutes, counting the rows added to BigQuery in the last hour. If the count is below 4000, trigger an alert.
B
Process the data stream in Dataflow with Kafka IO. Implement a sliding time window of 1 hour, updating every 5 minutes. Calculate the average upon window closure, and issue an alert if the average falls below 4000 messages.
C
Connect your Kafka message queue to Pub/Sub with Kafka Connect. Use a Dataflow template to move messages from Pub/Sub to Bigtable. Have Cloud Scheduler execute a script hourly to tally the rows inserted into Bigtable in the past hour. If the tally is under 4000, send an alert.
D
Ingest the data stream in Dataflow via Kafka IO. Apply a fixed 1-hour time window. Determine the average at the window's end, and alert if the average is less than 4000 messages.