
Answer-first summary for fast verification
Answer: Yes
The correct answer is 'Yes' because MimicExplainer supports both global and local feature importance explanations. According to Microsoft's documentation, MimicExplainer trains an interpretable surrogate model to mimic the original model's predictions, which lets it report feature importance both globally (across the whole dataset) and locally (for an individual prediction). This contrasts with PFIExplainer (Permutation Feature Importance), which supports only global importance and cannot produce local explanations. Some community comments incorrectly attribute PFIExplainer's limitation to MimicExplainer, but the official documentation and the higher-upvoted comments confirm that MimicExplainer provides both explanation types.
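To see why a mimic (global surrogate) explainer can produce both kinds of importance, the sketch below illustrates the underlying idea using only the standard library. It is a hypothetical simplification, not the azureml-interpret API: a linear surrogate is fit to a black-box model's outputs, its weight magnitudes serve as global importances, and per-feature contributions (weight times feature value) serve as local importances for a single instance.

```python
import random

random.seed(0)

# "Black box" to be explained: an arbitrary scoring function over 3 features.
# Feature 0 matters most, feature 2 not at all.
def black_box(x):
    return 2.0 * x[0] - 1.0 * x[1] + 0.0 * x[2]

# Training data for the surrogate: random points labeled by the black box.
X = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(200)]
y = [black_box(x) for x in X]

# Fit a linear surrogate via simple SGD so it mimics the black box.
w = [0.0, 0.0, 0.0]
for _ in range(200):
    for xi, yi in zip(X, y):
        pred = sum(wj * xj for wj, xj in zip(w, xi))
        err = pred - yi
        w = [wj - 0.05 * err * xj for wj, xj in zip(w, xi)]

# Global importance: magnitude of each surrogate weight across the dataset.
global_importance = [abs(wj) for wj in w]

# Local importance for one instance: per-feature contribution w_j * x_j.
x_new = [0.5, 0.5, 0.5]
local_importance = [wj * xj for wj, xj in zip(w, x_new)]

print(global_importance)  # feature 0 dominates, feature 2 near zero
print(local_importance)
```

Because the surrogate is itself interpretable, both views come from the same fitted model; PFIExplainer, by contrast, works by shuffling feature columns and measuring the drop in overall model performance, which is inherently a dataset-level (global-only) measure.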
Author: LeetQuiz Editorial Team
You train a classification model using logistic regression. You need to explain the model's predictions by calculating the importance of each feature, both as a global relative importance value and as a measure of local importance for a specific prediction set.
You create a MimicExplainer.
Does this solution meet the goal?
A. Yes
B. No