
Answer-first summary for fast verification
Answer: Do daily exports of Cloud Logging data to BigQuery. Create views filtering by project, log type, resource, and user.
The correct answer is **C**. Exporting Cloud Logging data directly to BigQuery (via a log sink) stores the logs in a scalable, cost-effective way and makes them immediately queryable. Creating views in BigQuery that filter by project, log type, resource, and user gives exactly the slices needed for the requested daily reports, and BigQuery's query engine processes large log volumes quickly, offering insight into resource consumption and usage.

**Why the other options are incorrect:**

- **A**: Manually filtering and exporting data in CSV format is inefficient for daily reports and doesn't scale well with large datasets. CSV format also limits advanced analytics capabilities.
- **B**: This method adds unnecessary complexity by routing the data through Cloud Storage and Dataprep for cleansing; exporting directly to BigQuery is more straightforward.
- **D**: Manually filtering in Cloud Logging and then importing the data into BigQuery is time-consuming and less efficient than exporting directly to BigQuery.
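As a concrete illustration of option C, the sketch below builds the SQL for a filtering view over exported logs. The dataset, table, and field names (`my_dataset.cloudaudit_googleapis_com_activity_*`, `protoPayload.authenticationInfo.principalEmail`, etc.) are assumptions for illustration only; the actual table names and schema depend on the sink configuration and the kinds of logs exported.

```python
# Sketch: build a CREATE VIEW statement for a BigQuery view that
# filters exported Cloud Logging data by project, log type, resource,
# and user. Table and field names are illustrative assumptions; the
# real names depend on the log sink configuration and export schema.

def build_daily_report_view_sql(project_id: str, log_type: str,
                                resource_type: str) -> str:
    """Return a CREATE VIEW statement filtering exported logs."""
    return f"""
CREATE OR REPLACE VIEW `{project_id}.reports.daily_compute_usage` AS
SELECT
  timestamp,
  logName,                                                       -- log type
  resource.type AS resource_type,                                -- resource
  resource.labels.project_id AS project,                         -- project
  protoPayload.authenticationInfo.principalEmail AS user_email   -- user
FROM
  `{project_id}.my_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
  logName LIKE '%{log_type}%'
  AND resource.type = '{resource_type}'
""".strip()

if __name__ == "__main__":
    # Hypothetical project and filter values for demonstration.
    print(build_daily_report_view_sql("my-project", "activity", "gce_instance"))
```

Once such a view exists, a scheduled daily query against it (or a BI tool reading it) produces the report without any manual filtering or export steps.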
Author: LeetQuiz Editorial Team
Your new customer requires daily reports to monitor their net consumption of Google Cloud compute resources and to identify the users utilizing these resources. What is the most efficient and prompt method to generate these daily reports?
A
Filter data in Cloud Logging by project, resource, and user; then export the data in CSV format.
B
Export Cloud Logging data to Cloud Storage in CSV format. Cleanse the data using Dataprep, filtering by project, resource, and user.
C
Do daily exports of Cloud Logging data to BigQuery. Create views filtering by project, log type, resource, and user.
D
Filter data in Cloud Logging by project, log type, resource, and user, then import the data into BigQuery.