
Answer-first summary for fast verification
Answer: Consume the stream of data in Dataflow using Kafka IO. Set a sliding time window of 1 hour every 5 minutes. Compute the average when the window closes, and send an alert if the average is less than 4000 messages.
The correct answer is A. A sliding window of 1 hour that advances every 5 minutes lets Dataflow recompute the one-hour moving average every 5 minutes, so it can detect a drop below 4000 messages per second within minutes of it happening. A fixed 1-hour window (option B) closes only once per hour, so it evaluates the average far less often and is not a true moving average; short-lived dips can go unnoticed until the next hourly boundary. Options C and D route the messages through Pub/Sub into a database and poll it with Cloud Scheduler, which adds components, cost, and latency without improving detection, so they are unnecessarily complex for this scenario.
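To make the sliding-window arithmetic concrete, here is a minimal stdlib Python sketch of the idea (not actual Beam or Kafka IO code; the function names and the event format are illustrative, while the window size, slide period, and threshold match the scenario). Each timestamped message count is assigned to every overlapping one-hour window, and a window raises an alert if its per-second average falls below 4000:

```python
from collections import defaultdict

WINDOW_SIZE = 3600   # 1-hour window, in seconds
WINDOW_PERIOD = 300  # slides every 5 minutes
THRESHOLD = 4000     # alert below this many messages per second

def window_starts(ts):
    """All sliding-window start times whose window [start, start + WINDOW_SIZE) contains ts."""
    last = (ts // WINDOW_PERIOD) * WINDOW_PERIOD
    first = last - WINDOW_SIZE + WINDOW_PERIOD
    return range(first, last + 1, WINDOW_PERIOD)

def alerts(events):
    """events: iterable of (timestamp_sec, message_count) pairs.

    Returns the sorted window start times whose per-second average
    message rate falls below THRESHOLD.
    """
    counts = defaultdict(int)
    for ts, n in events:
        # Each event belongs to WINDOW_SIZE / WINDOW_PERIOD overlapping windows.
        for start in window_starts(ts):
            counts[start] += n
    return sorted(start for start, total in counts.items()
                  if total / WINDOW_SIZE < THRESHOLD)
```

In the real pipeline, `beam.WindowInto` with a sliding window plays the role of `window_starts`, and the alert would be emitted when each window closes instead of after the fact.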
Author: LeetQuiz Editorial Team
You are managing an Internet of Things (IoT) pipeline leveraging Apache Kafka, which consistently processes approximately 5000 messages per second. As part of optimizing and ensuring the reliability of your data ingestion framework, you seek to implement an alert system within Google Cloud Platform. The objective is to generate an immediate alert if the moving average of incoming messages per second over a one-hour period falls below 4000. What steps should you take to accomplish this?
A. Consume the stream of data in Dataflow using Kafka IO. Set a sliding time window of 1 hour every 5 minutes. Compute the average when the window closes, and send an alert if the average is less than 4000 messages.
B. Consume the stream of data in Dataflow using Kafka IO. Set a fixed time window of 1 hour. Compute the average when the window closes, and send an alert if the average is less than 4000 messages.
C. Use Kafka Connect to link your Kafka message queue to Pub/Sub. Use a Dataflow template to write your messages from Pub/Sub to Bigtable. Use Cloud Scheduler to run a script every hour that counts the number of rows created in Bigtable in the last hour. If that number falls below 4000, send an alert.
D. Use Kafka Connect to link your Kafka message queue to Pub/Sub. Use a Dataflow template to write your messages from Pub/Sub to BigQuery. Use Cloud Scheduler to run a script every five minutes that counts the number of rows created in BigQuery in the last hour. If that number falls below 4000, send an alert.
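The latency advantage of the sliding window over the fixed window can be sketched with hypothetical numbers: assume the rate steps from 5000 down to 3000 msg/s at t = 0, and window closes are aligned to the drop. The average over the hour ending at time `end` dips below 4000 once more than half of that hour is post-drop, so the first alerting close depends only on how often windows close:

```python
WINDOW = 3600    # 1-hour averaging window, in seconds
PERIOD = 300     # sliding windows close every 5 minutes
THRESHOLD = 4000
PRE, POST = 5000, 3000  # msg/s before and after the drop at t = 0

def window_avg(end):
    """Average rate over the hour ending at `end`, for a step drop at t = 0."""
    post = min(max(end, 0), WINDOW)  # seconds of the window that are post-drop
    return (PRE * (WINDOW - post) + POST * post) / WINDOW

def first_alert(period):
    """First window-close time (a multiple of `period`) whose average is below THRESHOLD."""
    end = 0
    while window_avg(end) >= THRESHOLD:
        end += period
    return end

# Sliding windows (close every 5 min) alert at t = 2100 s;
# fixed windows (close every hour) alert at t = 3600 s.
sliding_alert = first_alert(PERIOD)
fixed_alert = first_alert(WINDOW)
```

With these assumed numbers, the sliding window fires 25 minutes sooner; if the drop were not aligned with a fixed-window boundary, the fixed window could average it away entirely, which is why option B is weaker.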