
As an ML engineer at a bank, you have developed a binary classification model with Google Cloud AutoML Tables that predicts whether customers will make their loan payments on time; its output directly influences loan approval decisions. A customer's loan request was recently rejected, and the bank's risk management department has asked for a detailed explanation of the model's decision in this specific case. The bank emphasizes transparency and accountability in its AI-driven decisions, particularly because of regulatory compliance requirements, and the explanation must be clear, actionable, and understandable by non-technical stakeholders. Which of the following approaches best meets these requirements? (Choose two options if option E is not available.)
A. Review the global feature importance percentages on the model evaluation page to explain the general behavior of the model.
B. Use the data summary page to analyze the correlation between features and the target variable, then infer the reasons for the specific rejection.
C. Investigate the local feature importance for this specific prediction to understand how each feature influenced the decision for this particular customer.
D. Manually adjust the features of the rejected application to identify which changes would reverse the model's decision, without using the model's interpretability features.
E. Combine global feature importance, to explain the model's general behavior, with local feature importance for this specific prediction, to provide a comprehensive explanation.
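For context, options C and E rely on AutoML Tables' ability to return local (per-prediction) feature importance alongside an online prediction. The sketch below shows how such an explanation might be retrieved with the legacy google-cloud-automl Python client; it is a minimal illustration, not the exam's reference solution. The project ID, region, model display name, and feature values are hypothetical placeholders, and the exact response fields may vary across client library versions.

# Minimal sketch: request local feature importance for a single loan application
# using the legacy AutoML Tables client (google-cloud-automl). The project,
# region, model name, and feature values below are hypothetical placeholders.
from google.cloud import automl_v1beta1 as automl

client = automl.TablesClient(project="my-bank-project", region="us-central1")

# The rejected customer's application, keyed by the model's input column names.
application = {
    "income": 42000,
    "loan_amount": 15000,
    "credit_history_months": 18,
}

# feature_importance=True asks AutoML Tables to attach local feature importance
# values (how each feature contributed to this one prediction) to the response.
response = client.predict(
    model_display_name="loan_repayment_model",
    inputs=application,
    feature_importance=True,
)

# Each payload entry carries a predicted class, its score, and the per-feature
# contributions for this specific prediction.
for result in response.payload:
    annotation = result.tables
    print(f"class={annotation.value.string_value} score={annotation.score:.3f}")
    for column in annotation.tables_model_column_info:
        print(f"  {column.column_display_name}: {column.feature_importance:+.4f}")

Printing the per-feature contributions in this way gives the customer-specific view that option C describes, while the global feature importance percentages on the model evaluation page (option A) supply the model-wide context that option E combines with it.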