
Answer-first summary for fast verification
Answer: A and C. Monitor the training/serving skew of feature values for requests sent to the endpoint, and enable Vertex Explainable AI feature attribution to understand the impact of each feature on the model's predictions.
The scenario describes a deployed model whose AUC degrades significantly over several weeks in production, the classic signature of data drift or concept drift. Option A is correct because monitoring training/serving skew directly detects data drift: when the statistical properties of production requests diverge from the training data, prediction quality drops. Vertex AI Model Monitoring is designed for exactly this check, and community consensus (67% chose A) supports it. Option C is also correct because Vertex Explainable AI feature attributions help diagnose the problem at the feature level, revealing whether particular features have become less predictive or whether their relationship to the churn label has shifted over time. Options B and D are less suitable: resource utilization (B) and latency (D) measure infrastructure health, not predictive performance, so neither diagnoses degradation caused by drift.
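To make the skew-monitoring idea concrete, the sketch below computes the two distance statistics that Vertex AI Model Monitoring's documentation describes for skew detection: L-infinity distance for categorical features and Jensen-Shannon divergence for (binned) numerical features. This is a minimal stdlib-only illustration of the underlying math, not the Vertex AI API itself; the feature values, vocabulary, and alert threshold are invented for the example.

```python
import math
from collections import Counter

def distribution(values, vocab):
    """Normalized histogram of categorical values over a fixed vocabulary."""
    counts = Counter(values)
    total = len(values)
    return [counts.get(v, 0) / total for v in vocab]

def linf_distance(p, q):
    """L-infinity distance between two distributions (the statistic
    Vertex AI documents for categorical-feature skew)."""
    return max(abs(pi - qi) for pi, qi in zip(p, q))

def js_divergence(p, q):
    """Jensen-Shannon divergence (documented for numerical features
    after binning); bounded in [0, 1] when using log base 2."""
    def kl(a, b):
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical churn feature: contract type, at training time vs. in serving logs.
vocab = ["monthly", "annual", "two_year"]
train = ["monthly"] * 50 + ["annual"] * 30 + ["two_year"] * 20
serve = ["monthly"] * 80 + ["annual"] * 15 + ["two_year"] * 5  # distribution has drifted

p, q = distribution(train, vocab), distribution(serve, vocab)
skew = linf_distance(p, q)
jsd = js_divergence(p, q)
print(f"L-inf skew = {skew:.2f}, JS divergence = {jsd:.3f}")
# A monitoring job would raise an alert when the statistic exceeds a
# user-configured threshold (the threshold value here is illustrative).
```

A large L-infinity value here means at least one category's share of traffic has shifted substantially from training, which is precisely the condition that explains an AUC drop without any infrastructure problem.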
Author: LeetQuiz Editorial Team
You have deployed a customer churn prediction model to a Vertex AI endpoint. After several weeks, the model's performance, as measured by the AUC, has significantly declined compared to its training evaluation. What steps should you take to diagnose this issue?
A
Monitor the training/serving skew of feature values for requests sent to the endpoint.
B
Monitor the resource utilization of the endpoint, such as CPU and memory usage, to identify potential bottlenecks in performance.
C
Enable Vertex Explainable AI feature attribution to analyze model predictions and understand the impact of each feature on the model's predictions.
D
Monitor the latency of the endpoint to determine whether predictions are being served within the expected time frame.