
Answer-first summary for fast verification
Answer: Create a Pub/Sub topic to export monitoring data, use Dataflow to process the data, and then use a BigQuery sink to store the data in BigQuery.
Creating a Pub/Sub topic to export monitoring data enables real-time streaming and decouples ingestion from processing. Dataflow then provides a scalable, serverless pipeline for transforming the data, and a BigQuery sink writes the results efficiently into BigQuery. This makes option C the most appropriate choice: it is scalable, cost-effective, and near real-time. The alternatives fall short: Cloud Monitoring offers no built-in direct-to-BigQuery export feature (A); polling the Monitoring API and performing streaming inserts yourself requires custom code that is hard to scale and is subject to API quotas (B); and staging data in Cloud Storage for a transfer job adds batch latency and operational overhead (D).
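In a Dataflow (Apache Beam) job, the Pub/Sub-to-BigQuery path reduces to a parse/flatten transform applied to each incoming message. Below is a minimal sketch of that transform step in plain Python, assuming a hypothetical message shape for illustration (the actual Cloud Monitoring payload schema differs):

```python
import json

def monitoring_message_to_row(message_bytes):
    """Parse a Pub/Sub message carrying a monitoring time-series point
    and flatten it into a BigQuery-ready row dict.

    The message structure here is an illustrative assumption, not the
    real Cloud Monitoring export format.
    """
    msg = json.loads(message_bytes.decode("utf-8"))
    return {
        "metric_type": msg["metric"]["type"],
        "resource_type": msg["resource"]["type"],
        "timestamp": msg["point"]["interval"]["endTime"],
        "value": float(msg["point"]["value"]["doubleValue"]),
    }

# Example message, mimicking a CPU-utilization data point.
sample = json.dumps({
    "metric": {"type": "compute.googleapis.com/instance/cpu/utilization"},
    "resource": {"type": "gce_instance"},
    "point": {
        "interval": {"endTime": "2024-01-01T00:00:00Z"},
        "value": {"doubleValue": 0.42},
    },
}).encode("utf-8")

row = monitoring_message_to_row(sample)
print(row["metric_type"], row["value"])
```

In the actual pipeline, this function would sit in a `beam.Map` step between the `ReadFromPubSub` source and the `WriteToBigQuery` sink, with the row keys matching the destination table's schema.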
Author: LeetQuiz Editorial Team
A company has deployed a multi-tier web application on Google Cloud Platform (GCP) and wants to use Cloud Monitoring to analyze application performance data. They have decided to integrate Cloud Monitoring with BigQuery to perform more complex analysis on the collected metrics. Which of the following approaches is the most appropriate way to achieve this integration while ensuring a scalable, cost-effective solution?
A
Enable the BigQuery export feature within Cloud Monitoring, which will export the monitoring data directly into BigQuery tables.
B
Use the Cloud Monitoring API to fetch the metric data and then use the BigQuery Streaming API to insert the data into BigQuery tables in real-time.
C
Create a Pub/Sub topic to export monitoring data, use Dataflow to process the data, and then use a BigQuery sink to store the data in BigQuery.
D
Export the monitoring data to a Cloud Storage bucket, then set up a Data Transfer service to move the data to BigQuery.