
Answer-first summary for fast verification
Answer: Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
The correct answer is B. Creating a Stackdriver Logging (now Cloud Logging) export with a sink destination of a BigQuery dataset centralizes the logs from all projects in one place where they can be explored and analyzed quickly with SQL, and setting the dataset's default table expiration to 60 days retains exactly the required window while controlling storage costs. Using an aggregated export sink for multi-project log analysis is the Google-recommended practice. Option A only filters the Logging viewer and is constrained by Logging's limited default retention, so it cannot guarantee 60 days of history or support ad hoc analysis. Option C archives the logs to Cloud Storage, which suits long-term storage but not interactive querying. Option D builds a custom Cloud Scheduler pipeline for a task that a native export sink already handles.
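As a rough sketch of what option B involves on the command line (the sink, organization, project, and dataset names below are placeholders, not values from the question):

```shell
# 1. Create the BigQuery dataset that will receive the logs, with a
#    default table expiration of 60 days (5,184,000 seconds) so log
#    tables are deleted automatically after the retention window.
bq mk --dataset --default_table_expiration 5184000 PROJECT_ID:DATASET

# 2. Create an aggregated log sink at the organization level so that
#    logs from all child projects are exported to the dataset.
gcloud logging sinks create SINK_NAME \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET \
  --organization=ORG_ID --include-children

# 3. The sink create command prints a writer service account; grant it
#    the BigQuery Data Editor role on the dataset so the export can write.
```

Once logs start flowing, the exported tables can be queried directly in BigQuery with standard SQL, which is what makes this approach efficient for exploring 60 days of logs across many projects.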
Author: LeetQuiz Editorial Team
You are responsible for handling multiple Google Cloud Platform (GCP) projects and you require access to the logs generated over the last 60 days for all these projects. It's crucial that you have the capability to efficiently explore and analyze these log entries. To achieve this, you aim to adhere to Google's best practices for aggregating and managing logs from multiple projects. What steps should you take to obtain and analyze these combined logs in accordance with Google's recommendations?
A. Navigate to Stackdriver Logging and select resource.labels.project_id="*"
B. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
C. Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.
D. Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.