
Answer-first summary for fast verification
Answer: Use Explainable AI.
The question asks for a Google AI Platform solution that helps HRL understand and interpret ML model predictions while improving their accuracy. Option A (Use Explainable AI) is correct: Explainable AI (XAI) is designed specifically to provide insight into model predictions by surfacing feature contributions, which helps identify bias, verify model behavior, and guide improvements. The community discussion strongly supports this, with 100% consensus and high upvotes (e.g., 32 upvotes on a comment detailing XAI's purpose). Some comments note that the legacy AI Platform "AI Explanations" feature is deprecated and has been replaced by Vertex AI, but the core functionality still aligns with Explainable AI principles.

The other options are unsuitable:
- B (Vision AI) targets computer vision tasks, not general model interpretation.
- C (Google Cloud's operations suite) monitors and troubleshoots infrastructure, not model predictions.
- D (Jupyter Notebooks) is a development environment, not a dedicated interpretation service.
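Vertex Explainable AI produces feature attributions using methods such as sampled Shapley and integrated gradients. To make "feature contributions" concrete, here is a minimal pure-Python sketch of the sampled-Shapley idea on a toy linear model; this is an illustration of the underlying technique, not the Vertex AI API, and the weights, inputs, and baseline are invented for the example:

```python
import random

def sampled_shapley(f, x, baseline, num_samples=50, seed=0):
    """Approximate Shapley attributions by averaging marginal
    contributions over randomly sampled feature orderings."""
    rng = random.Random(seed)
    n = len(x)
    attributions = [0.0] * n
    for _ in range(num_samples):
        perm = list(range(n))
        rng.shuffle(perm)
        current = list(baseline)      # start from the baseline input
        prev_val = f(current)
        for i in perm:
            current[i] = x[i]         # flip feature i to its real value
            val = f(current)
            attributions[i] += val - prev_val  # marginal contribution
            prev_val = val
    return [a / num_samples for a in attributions]

# Toy linear model: for f(v) = w.v + b, the exact Shapley value of
# feature i is w[i] * (x[i] - baseline[i]).
w, b = [2.0, -1.0, 0.5], 3.0
f = lambda v: sum(wi * vi for wi, vi in zip(w, v)) + b
attr = sampled_shapley(f, x=[1.0, 4.0, 2.0], baseline=[0.0, 0.0, 0.0])
```

A useful sanity check on any Shapley-style attribution is the efficiency property: the attributions sum to `f(x) - f(baseline)`, so each prediction's deviation from the baseline is fully accounted for by its features. That is exactly the per-feature breakdown Vertex Explainable AI returns for a deployed model's predictions.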
Author: LeetQuiz Editorial Team
You are working with the Helicopter Racing League (HRL) case study. HRL wants to improve the prediction accuracy of their ML models and use Google's AI Platform to make the predictions understandable and interpretable. What should you do?
A. Use Explainable AI.
B. Use Vision AI.
C. Use Google Cloud's operations suite.
D. Use Jupyter Notebooks.