
Answer-first summary for fast verification
Answer: Use a streaming Dataflow job that reads continuously from the Pub/Sub topic and performs the necessary aggregations using tumbling windows.
Correct. A streaming Dataflow job is the most efficient way to process and load application events from a Pub/Sub topic into BigQuery at scale: it reads continuously from the topic, autoscales with message volume, and tumbling (fixed, non-overlapping) windows group events into exact hourly buckets for aggregation. By contrast, an hourly batch job adds scheduling overhead and latency, and a per-message Cloud Function neither aggregates across messages nor scales economically for high throughput. Reference: [Google Cloud Dataflow Documentation on Streaming Pipelines](https://cloud.google.com/dataflow/docs/concepts/streaming-pipelines#tumbling-windows).
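In a real pipeline this windowing is expressed with Apache Beam's `beam.WindowInto(window.FixedWindows(60 * 60))` followed by a grouped aggregation. The stdlib-only sketch below (hypothetical helper names, no Beam or Pub/Sub dependency) illustrates the semantics of a one-hour tumbling window: every event falls into exactly one non-overlapping hour bucket, and counts are aggregated per bucket.

```python
from datetime import datetime, timezone

def hour_bucket(ts: float) -> datetime:
    """Truncate a Unix timestamp to the start of its hour (UTC)."""
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    return dt.replace(minute=0, second=0, microsecond=0)

def aggregate_hourly(events):
    """Count events per one-hour tumbling window.

    `events` is an iterable of (unix_timestamp, payload) pairs,
    standing in for Pub/Sub messages with event timestamps.
    """
    counts = {}
    for ts, _payload in events:
        window = hour_bucket(ts)
        counts[window] = counts.get(window, 0) + 1
    return counts

events = [
    (datetime(2024, 5, 1, 10, 5, tzinfo=timezone.utc).timestamp(), "a"),
    (datetime(2024, 5, 1, 10, 59, tzinfo=timezone.utc).timestamp(), "b"),
    (datetime(2024, 5, 1, 11, 0, tzinfo=timezone.utc).timestamp(), "c"),
]
# Two events land in the 10:00 bucket, one in the 11:00 bucket.
print(aggregate_hourly(events))
```

Unlike this in-memory sketch, a streaming Dataflow job applies the same bucketing continuously as messages arrive, emitting each window's aggregate to BigQuery once the window closes.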
Author: LeetQuiz Editorial Team
What is the most efficient method to process and load application events from a Pub/Sub topic into BigQuery, ensuring scalability for large volumes and aggregating them hourly?
A
Schedule a batch Dataflow job to run hourly, pulling available messages from the Pub/Sub topic and performing the necessary aggregations.
B
Use a Cloud Function triggered by Pub/Sub to perform data processing every time a new message is published.
C
Use a streaming Dataflow job that reads continuously from the Pub/Sub topic and performs the necessary aggregations using tumbling windows.
D
Schedule a Cloud Function to run hourly, pulling available messages from the Pub/Sub topic and performing the necessary aggregations.