
Answer-first summary for fast verification
Answer: Deploy a custom FluentD daemonset to the cluster that filters out the sensitive information, so it is not logged.
The correct approach is to deploy a custom FluentD daemonset, which guarantees a FluentD pod on every node that can filter out sensitive data before logs are sent to Cloud Logging. Only this option provides the customization the requirement demands. The other options fall short: system and workload logging offers no way to customize or redact log content; legacy logging is deprecated and not recommended for new clusters; and a Kubernetes Deployment, unlike a daemonset, does not guarantee a FluentD pod on every node. Reference: Customizing Cloud Logging logs for Google Kubernetes Engine with Fluentd.
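The filtering itself happens in the Fluentd pipeline of the custom daemonset. A minimal sketch of such a filter, assuming the standard `kubernetes.**` tag for container logs and an SSN-like regex as a stand-in for whatever the application actually considers sensitive (both are illustrative assumptions, not part of the question):

```
# Illustrative Fluentd filter: mask a sensitive pattern before records
# are shipped to Cloud Logging. The tag and regex are assumptions.
<filter kubernetes.**>
  @type record_transformer
  enable_ruby true
  <record>
    # Replace anything matching the assumed sensitive pattern with a placeholder
    log ${record["log"].to_s.gsub(/\b\d{3}-\d{2}-\d{4}\b/, "[REDACTED]")}
  </record>
</filter>
```

In a real deployment this config would be mounted into the FluentD pods (for example via a ConfigMap) so every node applies the same redaction before forwarding logs.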
Author: LeetQuiz Editorial Team
Imagine you're on a team developing a containerized application for deployment on GKE. The application, which handles sensitive user data, is set to run on a five-node cluster within a single region. There's a critical requirement to ensure sensitive data is excluded from logs before they're sent to Cloud Logging. Which option best fulfills this requirement?
A
Enable Cloud Operations for GKE and select System monitoring only (Logging disabled).
B
Deploy a custom FluentD daemonset to the cluster that filters out the sensitive information, so it is not logged.
C
Enable Cloud Operations for GKE and select System and workload logging and monitoring.
D
Deploy a custom FluentD deployment to the cluster that filters out the sensitive information, so it is not logged.
E
Enable Cloud Operations for GKE and select Legacy logging and monitoring.
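The distinction between options B and D comes down to workload type: a DaemonSet schedules exactly one pod per node, which is what guarantees log filtering everywhere. A minimal manifest sketch of the correct answer, where the names, image tag, and ConfigMap are assumptions for illustration, not a definitive implementation:

```yaml
# Illustrative DaemonSet: one FluentD pod per node, reading node logs
# and applying the custom filter config. Names and image are assumed.
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: fluentd-custom
  namespace: kube-system
spec:
  selector:
    matchLabels:
      app: fluentd-custom
  template:
    metadata:
      labels:
        app: fluentd-custom
    spec:
      containers:
      - name: fluentd
        image: fluent/fluentd:v1.16   # assumed image tag
        volumeMounts:
        - name: varlog
          mountPath: /var/log        # container logs on the node
        - name: config
          mountPath: /fluentd/etc    # custom filter config
      volumes:
      - name: varlog
        hostPath:
          path: /var/log
      - name: config
        configMap:
          name: fluentd-custom-config  # assumed ConfigMap holding the filter
```

A Deployment with the same pod spec (option D) could place all replicas on a subset of the five nodes, leaving some nodes' logs unfiltered, which is why it does not satisfy the requirement.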