What is the most efficient method to process application events from a Pub/Sub topic, aggregate them hourly, and load them into BigQuery in a way that scales to large volumes?
A. Schedule a batch Dataflow job to run hourly, pulling available messages from the Pub/Sub topic and performing the necessary aggregations.

B. Use a Cloud Function triggered by Pub/Sub to perform data processing every time a new message is published.

C. Use a streaming Dataflow job that reads continuously from the Pub/Sub topic and performs the necessary aggregations using tumbling windows.

D. Schedule a Cloud Function to run hourly, pulling available messages from the Pub/Sub topic and performing the necessary aggregations.
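
For context, the streaming approach described in option C could look roughly like the following Apache Beam (Python) sketch. The project, topic, dataset, and table names, as well as the `event_type` field being counted, are hypothetical placeholders, not part of the original question:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True tells the runner (e.g. Dataflow) to keep the job running
    # and read from Pub/Sub continuously rather than as a bounded batch.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical topic path; substitute your own project and topic.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/app-events")
            | "ParseJson" >> beam.Map(json.loads)
            # Tumbling (fixed, non-overlapping) one-hour windows.
            | "HourlyWindows" >> beam.WindowInto(beam.window.FixedWindows(60 * 60))
            # Assumes each event carries an "event_type" field to group on.
            | "KeyByType" >> beam.Map(lambda event: (event["event_type"], 1))
            | "CountPerType" >> beam.CombinePerKey(sum)
            | "ToTableRow" >> beam.Map(
                lambda kv: {"event_type": kv[0], "event_count": kv[1]})
            # Hypothetical dataset/table; appends one row per event type
            # per hourly window.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:app_events.hourly_counts",
                schema="event_type:STRING,event_count:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The appeal of this design is that a long-running streaming job avoids the scheduling gaps and repeated pull overhead of hourly batch jobs, while Dataflow's autoscaling handles volume spikes and the fixed windows produce the hourly aggregates natively.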