
Answer-first summary for fast verification
Answer: Create an aggregated log sink in each of the Dev and Prod folders.
The requirement is to export logs from development and production projects to separate BigQuery datasets (`dev_dataset` and `prod_dataset`) while minimizing the number of log sinks and automatically including future projects.

- Option A (single organization-level sink) fails because a sink exports to only one destination, so logs cannot be split between the two datasets.
- Option B (sink per project) violates the minimization requirement: every new project needs a manually created and configured sink.
- Option C (two organization-level sinks filtered by project ID) is inefficient because project-ID filters must be updated manually for every new project, failing the future-project requirement.
- Option D (aggregated log sinks in the Dev and Prod folders) meets all criteria:
  - Two sinks, the minimum possible for two destination datasets.
  - Aggregated sinks at the folder level automatically include all current and future projects in the folder hierarchy.
  - Each sink can be configured with its own destination dataset (Dev folder sink → `dev_dataset`, Prod folder sink → `prod_dataset`).
  - Folder-level sinks apply to all subfolders and projects beneath them, so the solution scales without manual updates.
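As a sketch, the two folder-level aggregated sinks from Option D could be created with `gcloud`; the folder IDs (`111111111111`, `222222222222`) and the destination project (`my-logging-project`) below are placeholders, not values from the question:

```shell
# Dev folder: one aggregated sink covering every current and future
# project under the folder (placeholder folder ID).
gcloud logging sinks create dev-sink \
  bigquery.googleapis.com/projects/my-logging-project/datasets/dev_dataset \
  --folder=111111111111 \
  --include-children

# Prod folder: same pattern, different destination dataset.
gcloud logging sinks create prod-sink \
  bigquery.googleapis.com/projects/my-logging-project/datasets/prod_dataset \
  --folder=222222222222 \
  --include-children
```

The `--include-children` flag is what makes the sink aggregated, so logs from all projects under each folder, including projects created later, are routed without further changes. After creation, each sink's writer identity (a service account shown in the command output) must be granted write access, such as BigQuery Data Editor, on its destination dataset.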
Author: LeetQuiz Editorial Team
You need to create Cloud Logging sinks to export log entries from development projects in the Dev folder to dev_dataset and from production projects in the Prod folder to prod_dataset. The solution must minimize the number of sinks while ensuring they apply to future projects. What is the best approach?
A
Create a single aggregated log sink at the organization level.
B
Create a log sink in each project.
C
Create two aggregated log sinks at the organization level, and filter by project ID.
D
Create an aggregated log sink in the Dev and Prod folders.