
Answer-first summary for fast verification
Answer: Send the data to Google Cloud Pub/Sub, stream Cloud Pub/Sub to Google Cloud Dataflow, and store the data in Google BigQuery.
Option B is the correct answer. Google Cloud Pub/Sub provides scalable, low-latency ingestion of the streaming telemetry; Google Cloud Dataflow processes and transforms the stream in real time; and Google BigQuery is a fully managed, highly scalable data warehouse suited to real-time analysis and querying of large datasets. Together, these three services form the most suitable solution for processing, storing, and analyzing very large datasets in real time.
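To make the pipeline shape concrete, here is a minimal, library-free Python sketch that simulates the three stages. A `Queue` stands in for the Pub/Sub topic, a plain function stands in for the Dataflow transform, and a list stands in for the BigQuery table; all names are illustrative and are not actual Google Cloud client-library calls.

```python
from queue import Queue

# Simulated pipeline: IoT devices -> Pub/Sub (ingest)
#                     -> Dataflow (transform) -> BigQuery (sink).

def publish(topic: Queue, device_id: str, temp_c: float) -> None:
    """Device side: publish one temperature reading to the 'topic'."""
    topic.put({"device_id": device_id, "temp_c": temp_c})

def transform(reading: dict) -> dict:
    """Dataflow-style transform: enrich each streamed record."""
    return {**reading, "temp_f": reading["temp_c"] * 9 / 5 + 32}

def run_pipeline(topic: Queue, sink: list) -> None:
    """Drain the topic, transform each record, append to the 'warehouse'."""
    while not topic.empty():
        sink.append(transform(topic.get()))

topic: Queue = Queue()
warehouse: list = []  # stands in for a BigQuery table

publish(topic, "sensor-001", 21.5)
publish(topic, "sensor-002", 30.0)
run_pipeline(topic, warehouse)
print(warehouse[0]["temp_f"])  # 70.7
```

In a real deployment, `publish` would be a Pub/Sub client publishing from each device, `transform` would be a step in a Dataflow (Apache Beam) streaming job, and the sink would be a BigQuery streaming insert; the decoupled ingest/process/store structure is what lets the design absorb bursts from 10,000 devices.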
Author: LeetQuiz Editorial Team
You are about to deploy 10,000 new Internet of Things (IoT) devices across multiple warehouses around the world. These devices will continuously collect temperature data, resulting in a significant volume of data that needs to be processed, stored, and analyzed in real time. Considering the scale and the need for real-time insights, how should you approach the processing, storage, and analysis of these large datasets?
A
Send the data to Google Cloud Datastore and then export to BigQuery.
B
Send the data to Google Cloud Pub/Sub, stream Cloud Pub/Sub to Google Cloud Dataflow, and store the data in Google BigQuery.
C
Send the data to Cloud Storage and then spin up an Apache Hadoop cluster as needed in Google Cloud Dataproc whenever analysis is required.
D
Export logs in batch to Google Cloud Storage and then spin up a Google Cloud SQL instance, import the data from Cloud Storage, and run an analysis as needed.