
Answer-first summary for fast verification
Answer:
1. Use a dual-region Cloud Storage bucket with turbo replication enabled.
2. Monitor Dataflow metrics with Cloud Monitoring to detect a regional outage.
3. Seek the subscription back in time by 60 minutes to recover acknowledged messages.
4. Start the Dataflow job in a secondary region.
Option D is correct. A dual-region Cloud Storage bucket with turbo replication replicates newly written objects to the second region with a 15-minute RPO target, which matches the stated recovery objective. Monitoring Dataflow metrics (for example, data freshness or system lag) in Cloud Monitoring lets you detect when a regional outage has interrupted the pipeline. Because the Pub/Sub topic retains messages for one day, seeking the subscription back 60 minutes replays messages published in that window, including those already acknowledged, so the restarted pipeline can reprocess anything that was in flight when the region failed. Finally, starting the Dataflow job in a secondary region restores processing continuity.
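As a rough sketch of what these steps look like on the command line, assuming illustrative placeholder names (my-dualregion-bucket, my-subscription, my-recovery-job, the predefined NAM4 dual-region, and us-east1 as the secondary region) — verify flags and locations against your own project before use:

```shell
# Create a dual-region bucket with turbo replication (ASYNC_TURBO
# targets a 15-minute RPO between the two regions).
gcloud storage buckets create gs://my-dualregion-bucket \
    --location=NAM4 \
    --rpo=ASYNC_TURBO

# During a regional outage: replay the last 60 minutes of messages by
# seeking the subscription back in time. The topic's one-day message
# retention comfortably covers this window.
gcloud pubsub subscriptions seek my-subscription \
    --time="$(date -u -d '60 minutes ago' +%Y-%m-%dT%H:%M:%SZ)"

# Relaunch the Dataflow job from a template in the secondary region
# (template path is a placeholder).
gcloud dataflow jobs run my-recovery-job \
    --region=us-east1 \
    --gcs-location=gs://my-dualregion-bucket/templates/my-template
```

These commands require an authenticated gcloud session and appropriate IAM permissions; they are shown here only to make the four answer steps concrete.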
Author: LeetQuiz Editorial Team
You are responsible for designing an Apache Beam processing pipeline that reads data from a Google Cloud Pub/Sub topic. The Pub/Sub topic has a message retention duration of one day. The pipeline writes the processed data to a Cloud Storage bucket. Your objective is to choose a bucket location and a processing strategy that can ensure data is not lost in the event of a regional outage, meeting a Recovery Point Objective (RPO) of 15 minutes. What actions should you take?
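The only time-sensitive piece of the recovery procedure is computing the seek target: an RFC 3339 UTC timestamp 60 minutes in the past, within the topic's one-day retention. A minimal, self-contained sketch (GNU date; the epoch arithmetic is shown so the logic is portable):

```shell
#!/bin/sh
# Compute an RFC 3339 UTC timestamp 60 minutes in the past, the value
# passed to `gcloud pubsub subscriptions seek --time=...`.
now_epoch=$(date -u +%s)                    # current time, seconds since epoch
seek_epoch=$((now_epoch - 60 * 60))         # subtract 60 minutes
seek_time=$(date -u -d "@${seek_epoch}" +%Y-%m-%dT%H:%M:%SZ)
echo "${seek_time}"
```

Seeking to this timestamp redelivers every message published after it, acknowledged or not, which is why the pipeline must be idempotent (or deduplicate downstream) when replaying.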