
You operate a car factory that publishes machine measurement data as messages to a Pub/Sub topic in your Google Cloud project. You have developed a Dataflow streaming job with the Apache Beam SDK that reads these messages, acknowledges them to Pub/Sub, applies custom business logic in a DoFn instance, and writes the results to BigQuery. You want any message that fails due to a business logic error to be redirected to a separate Pub/Sub topic used for monitoring and alerting. How should you accomplish this?
A
Enable retaining of acknowledged messages in your Pub/Sub pull subscription. Use Cloud Monitoring to monitor the subscription/num_retained_acked_messages metric on this subscription.
B
Use an exception handling block in your Dataflow DoFn code to push messages that fail transformation to a side output and publish them to a new Pub/Sub topic. Use Cloud Monitoring to monitor the topic/num_unacked_messages_by_region metric on this new topic.
C
Enable dead lettering in your Pub/Sub pull subscription, and specify a new Pub/Sub topic as the dead letter topic. Use Cloud Monitoring to monitor the subscription/dead_letter_message_count metric on your pull subscription.
D
Create a snapshot of your Pub/Sub pull subscription. Use Cloud Monitoring to monitor the snapshot/num_messages metric on this snapshot.
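The side-output pattern described in option B can be sketched without the Beam framework. This is a minimal, framework-free model: the function `apply_business_logic` and the failure condition are hypothetical stand-ins for your custom DoFn logic. In an actual Beam pipeline you would instead `yield beam.pvalue.TaggedOutput('dead_letter', msg)` inside the `except` block, split the outputs with `.with_outputs()`, and write the dead-letter PCollection to the monitoring topic.

```python
def apply_business_logic(message: dict) -> dict:
    """Hypothetical transform: fails on messages missing a measurement."""
    if "value" not in message:
        raise ValueError("missing measurement value")
    # Illustrative conversion: metres to millimetres.
    return {**message, "value_mm": message["value"] * 1000}

def process(messages):
    """Route each message to the main output or the dead-letter output.

    Models what a Beam DoFn with an exception handling block does:
    successes continue toward BigQuery, failures go to a side output
    that would be published to a separate Pub/Sub topic.
    """
    main_output, dead_letter = [], []
    for msg in messages:
        try:
            main_output.append(apply_business_logic(msg))
        except Exception:
            # In Beam: yield beam.pvalue.TaggedOutput('dead_letter', msg)
            dead_letter.append(msg)
    return main_output, dead_letter
```

The key design point is that the failing message is not dropped or retried indefinitely: it is captured at the point of failure, with the pipeline's main path unaffected, and surfaced on a dedicated topic where Cloud Monitoring alerts can be attached.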