
Answer-first summary for fast verification
Answer: Configure feature-based explanations by using Integrated Gradients. Set visualization type to PIXELS, and set clip_percent_upperbound to 95.
The correct answer is A. Configuring feature-based explanations with Integrated Gradients, visualization type PIXELS, and clip_percent_upperbound set to 95 produces per-pixel attributions that highlight exactly which regions of each image drive the model's prediction. Clipping the top 5% of attribution values filters out outlier noise so the strongest signal stands out, making it easier to see why certain images are consistently mislabeled with high confidence and to act on that insight. By contrast, a region-based method such as XRAI with OUTLINES can obscure fine-grained defect detail, and example-based or nearest-neighbor approaches surface similar images rather than pixel-level attributions.
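To make the recommended configuration concrete, here is a minimal sketch of the explanation spec expressed as plain Python dictionaries. The field names follow the Vertex AI ExplanationSpec resource (`integrated_gradients_attribution`, `visualization`, `clip_percent_upperbound`); the tensor names (`image`, `probabilities`) are hypothetical placeholders for illustration.

```python
# Sketch of a Vertex AI explanation spec for feature-based explanations
# using Integrated Gradients with PIXELS visualization.
# Tensor names below are placeholders; adapt them to your model.

explanation_parameters = {
    # Integrated Gradients attribution; step_count controls the number
    # of interpolation steps along the gradient path (50 is a common default).
    "integrated_gradients_attribution": {"step_count": 50}
}

explanation_metadata = {
    "inputs": {
        "image": {
            "input_tensor_name": "image",       # placeholder tensor name
            "modality": "image",
            "visualization": {
                "type": "PIXELS",               # per-pixel attribution overlay
                "clip_percent_upperbound": 95,  # clip top 5% to reduce noise
            },
        }
    },
    "outputs": {
        "probabilities": {"output_tensor_name": "probabilities"}  # placeholder
    },
}

# These dictionaries would typically be passed to Model.upload(...) in the
# google-cloud-aiplatform SDK via its explanation_parameters and
# explanation_metadata arguments when deploying the model.
```

With this spec attached to the deployed model, requesting explanations for the mislabeled holdout images returns pixel-level attribution maps you can inspect to locate the image regions the model is keying on.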
Author: LeetQuiz Editorial Team
You work for a manufacturing company focused on producing high-quality products. At the end of the assembly line, you need to ensure that a custom image classification model accurately detects product defects. Your current model has been trained, and although it performs well overall, some images in your holdout set are consistently mislabeled with high confidence, which could lead to poor quality control. To improve the model and understand these mislabelings, you plan to use Vertex AI. What approach should you take to analyze and address these issues?
A. Configure feature-based explanations by using Integrated Gradients. Set visualization type to PIXELS, and set clip_percent_upperbound to 95.
B. Create an index by using Vertex AI Matching Engine. Query the index with your mislabeled images.
C. Configure feature-based explanations by using XRAI. Set visualization type to OUTLINES, and set polarity to positive.
D. Configure example-based explanations. Specify the embedding output layer to be used for the latent space representation.