
Answer-first summary for fast verification
Answer: Use BigQuery as the data warehouse. Connect all vehicles to the network and stream data into BigQuery using Cloud Pub/Sub and Cloud Dataflow. Use Google Data Studio for analysis and reporting.
Option A is the correct answer because it provides a real-time streaming architecture built on managed services, which aligns with the TerramEarth case study requirements. Cloud Pub/Sub and Cloud Dataflow enable continuous ingestion of vehicle sensor data, allowing immediate analysis in BigQuery to detect potential failures before they cause unplanned downtime. Google Data Studio then provides accessible reporting for maintenance teams. This approach is supported by the community consensus (100% of votes for A, with 41 upvotes on the top comment) and by Google's recommended architecture for connected vehicles. Option B uses batch uploads, which delay insights; Option C adds unnecessary complexity with a Dataproc Hive intermediary; and Option D relies on legacy Hadoop tools (Hive, Pig) instead of modern serverless analytics.
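The ingestion path in Option A starts with each vehicle publishing sensor readings to a Cloud Pub/Sub topic. A minimal sketch of that producer side is below; the field names, project ID, and topic name are illustrative assumptions, since the case study excerpt does not specify the telemetry schema. Only the payload encoding runs without GCP credentials, so the publish call is shown commented out.

```python
import json
from datetime import datetime, timezone


def build_telemetry_message(vehicle_id: str, sensor: str, value: float) -> bytes:
    """Encode one vehicle sensor reading as a JSON Pub/Sub payload.

    The schema here (vehicle_id / sensor / value / event_time) is a
    hypothetical example, not the actual TerramEarth message format.
    """
    record = {
        "vehicle_id": vehicle_id,
        "sensor": sensor,
        "value": value,
        # Event timestamps let Dataflow window the stream correctly.
        "event_time": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record, sort_keys=True).encode("utf-8")


# Publishing side (requires the google-cloud-pubsub client library and
# credentials; project and topic names below are placeholders):
#
# from google.cloud import pubsub_v1
# publisher = pubsub_v1.PublisherClient()
# topic = publisher.topic_path("my-project", "vehicle-telemetry")
# publisher.publish(topic, build_telemetry_message("veh-001", "engine_temp_c", 104.2))
```

Downstream, a Dataflow pipeline would subscribe to the topic, parse these JSON payloads, and stream the rows into a BigQuery table for Data Studio dashboards.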
Author: LeetQuiz Editorial Team
Based on the TerramEarth case study and its technical requirements, what is the recommended approach to minimize unplanned vehicle downtime using Google Cloud Platform?
A. Use BigQuery as the data warehouse. Connect all vehicles to the network and stream data into BigQuery using Cloud Pub/Sub and Cloud Dataflow. Use Google Data Studio for analysis and reporting.
B. Use BigQuery as the data warehouse. Connect all vehicles to the network and upload gzip files to a Multi-Regional Cloud Storage bucket using gcloud. Use Google Data Studio for analysis and reporting.
C. Use Cloud Dataproc Hive as the data warehouse. Upload gzip files to a Multi-Regional Cloud Storage bucket. Upload this data into BigQuery using gcloud. Use Google Data Studio for analysis and reporting.
D. Use Cloud Dataproc Hive as the data warehouse. Directly stream data into partitioned Hive tables. Use Pig scripts to analyze data.