
Answer-first summary for fast verification
Answer: Stream data to Pub/Sub, and use Dataflow to send data to Cloud Storage.
The correct answer is A. Streaming data to Pub/Sub decouples ingestion from processing and storage, providing a scalable, reliable message queue that can absorb the high volume of data arriving from millions of sensors. Using Dataflow to consume messages from Pub/Sub and write them to Cloud Storage enables real-time processing en route to the data lake. Cloud Storage is a scalable object store with high durability and availability, well suited to large volumes of both structured and unstructured IoT data. Together these services form a highly available, resilient architecture that follows Google-recommended practices. The other options fail on key details: Storage Transfer Service (B, D) performs batch transfers between storage systems and is not a streaming path into BigQuery, Dataflow (C, D) is a processing service rather than an ingestion endpoint for sensors, and Dataprep by Trifacta (C) is an interactive data-preparation tool, not a streaming delivery mechanism.
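To make the decoupling argument concrete, the sketch below simulates the recommended pattern locally: a `queue.Queue` stands in for a Pub/Sub topic, a dict of named objects stands in for a Cloud Storage bucket, and a batching consumer plays the role of the Dataflow pipeline. All names here are illustrative; a real deployment would use the `google-cloud-pubsub` client and an Apache Beam pipeline run on Dataflow.

```python
import json
import queue
import threading

# Local analogy of the A-option architecture (assumptions, not GCP APIs):
# - `topic` stands in for a Pub/Sub topic (durable ingestion buffer)
# - `bucket` stands in for a Cloud Storage bucket (object name -> bytes)
topic = queue.Queue()
bucket = {}

def sensor(sensor_id, n):
    """Producer: a sensor publishes readings without knowing who consumes them."""
    for i in range(n):
        topic.put(json.dumps({"sensor": sensor_id, "seq": i, "temp": 20 + i}))

def pipeline(batch_size=5):
    """Consumer: drains the topic, batches messages, and writes one object per
    batch, mirroring how a Dataflow job windows a stream into Cloud Storage files."""
    batch, batch_no = [], 0
    while True:
        try:
            msg = topic.get(timeout=0.2)
        except queue.Empty:
            break
        batch.append(msg)
        if len(batch) >= batch_size:
            bucket[f"events/batch-{batch_no:04d}.jsonl"] = "\n".join(batch).encode()
            batch, batch_no = [], batch_no + 1
    if batch:  # flush the final partial batch
        bucket[f"events/batch-{batch_no:04d}.jsonl"] = "\n".join(batch).encode()

# Several sensors publish concurrently; the pipeline consumes independently.
producers = [threading.Thread(target=sensor, args=(s, 4)) for s in range(3)]
for t in producers:
    t.start()
for t in producers:
    t.join()
pipeline()

total = sum(len(v.decode().splitlines()) for v in bucket.values())
print(total)  # 12 readings landed across batched objects
```

Because producers only ever touch the queue, sensors keep publishing even if the consumer is slow or restarting, which is the availability property the question is testing.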
Author: LeetQuiz Editorial Team
You are tasked with designing a data lake on Google Cloud for an Internet of Things (IoT) application that involves millions of sensors continuously streaming both structured and unstructured data to your cloud backend. The goal is to implement an architecture that ensures high availability and resilience, adhering strictly to Google's recommended practices. What steps should you take to accomplish this?
A. Stream data to Pub/Sub, and use Dataflow to send data to Cloud Storage.
B. Stream data to Pub/Sub, and use Storage Transfer Service to send data to BigQuery.
C. Stream data to Dataflow, and use Dataprep by Trifacta to send data to Bigtable.
D. Stream data to Dataflow, and use Storage Transfer Service to send data to BigQuery.