
Answer-first summary for fast verification
Answer: No
The question requires an explainer that can report feature importance both globally (overall model behavior) and locally (for individual predictions). PFIExplainer implements Permutation Feature Importance, which measures how much model performance degrades when each feature's values are randomly shuffled. This technique produces only global importance values; it cannot generate local, per-prediction explanations. Community discussion reflects a strong consensus on this point, and the official documentation and sample notebooks confirm that PFIExplainer does not support local explanations. Because the solution requires both global and local importance, creating a PFIExplainer does not meet the goal; an explainer that supports both, such as MimicExplainer or TabularExplainer, would be needed instead.
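To see why permutation importance is inherently global, here is a minimal sketch using scikit-learn's `permutation_importance` (a stand-in for PFIExplainer's underlying technique, not the Azure ML API itself): each feature is shuffled and the drop in overall model score is recorded, yielding one importance value per feature for the whole dataset and nothing per individual prediction.

```python
# Sketch of permutation feature importance using scikit-learn.
# Note: this illustrates the technique behind PFIExplainer; it is not
# the azureml-interpret API.
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression

# Synthetic binary classification data with 4 features.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
model = LogisticRegression().fit(X, y)

# Shuffle each feature n_repeats times and measure the score drop.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)

# One global importance score per feature -- there is no per-row output,
# which is why this method cannot provide local explanations.
print(result.importances_mean.shape)  # (4,)
```

Because the score drop is averaged over the entire dataset, there is no mechanism to attribute a specific prediction to specific feature values, which is exactly the local-importance capability the question demands.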
Author: LeetQuiz Editorial Team
You train a classification model using logistic regression. You need to explain the model's predictions by calculating the importance of each feature, both as a global relative importance value and as a local importance measure for specific predictions.
You create a PFIExplainer.
Does this solution meet the goal?
A
Yes
B
No