
Answer-first summary for fast verification
Answer: Use local feature importance from the predictions.
The correct answer is A: Use local feature importance from the predictions. In AutoML Tables, local feature importance shows how much each feature contributed to the model's decision for a particular instance, which is exactly what is needed to explain why the model rejected this specific customer's loan request. Unlike global feature importance, which summarizes feature impact across all predictions, local feature importance is computed per prediction, aligning with the bank's risk department's request for an explanation of one specific case. This makes local feature importance the most appropriate choice for this scenario.
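As an illustration of how local feature importances can be turned into an explanation, the sketch below ranks features by the magnitude of their per-prediction importance. The `prediction` dict is a simplified stand-in for a real AutoML Tables prediction response (the actual payload structure and field names differ); the feature names and values are hypothetical.

```python
# Hypothetical per-prediction payload. When feature importance is requested
# with a prediction, AutoML Tables returns a local importance value for each
# input column alongside the score. This dict is a stand-in, not real API output.
prediction = {
    "classification_score": 0.31,       # predicted probability of on-time payment
    "local_feature_importance": {
        "debt_to_income_ratio": -0.42,  # pushed this prediction toward rejection
        "credit_history_length": 0.15,  # pushed toward approval
        "recent_delinquencies": -0.27,
        "annual_income": 0.08,
    },
}

def top_reasons(pred, n=3):
    """Return the n features with the largest absolute local importance,
    i.e. the features that most influenced this single decision."""
    importances = pred["local_feature_importance"]
    return sorted(importances.items(), key=lambda kv: abs(kv[1]), reverse=True)[:n]

for name, score in top_reasons(prediction):
    direction = "against approval" if score < 0 else "toward approval"
    print(f"{name}: {score:+.2f} ({direction})")
```

Note that the sign convention here (negative pushes toward rejection) is an assumption for the sketch; the key point is that the explanation is built from the importances attached to this one prediction, not from model-wide statistics.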
Author: LeetQuiz Editorial Team
You are an ML engineer at a bank where you have implemented a binary classification model using Google Cloud AutoML Tables. The model predicts whether a customer will make loan payments on time, and this prediction is used to approve or reject loan requests. Recently, the model rejected a loan request for a customer, and now the bank's risk department wants to understand the reasons behind this specific decision. What approach should you take to provide an explanation for the model's decision?
A
Use local feature importance from the predictions.
B
Use the correlation with target values in the data summary page.
C
Use the feature importance percentages in the model evaluation page.
D
Vary features independently to identify the threshold per feature that changes the classification.