
As a magazine distributor, you've deployed a TensorFlow model on Google Cloud's AI Platform to predict customer subscription renewals for the next year. Your model is trained on historical data, including various customer attributes. To enhance model interpretability and understand which customer attribute most significantly influences each prediction, you're considering several approaches. Given the need for accuracy and the ability to explain predictions to stakeholders, which of the following methods should you employ? (Choose one correct option.)
A
Use the What-If Tool in Google Cloud to evaluate model performance by systematically excluding each feature and observing the impact on prediction accuracy. This method helps in ranking features based on their contribution to model performance.
B
Conduct a Lasso regression analysis using AI Platform notebooks to identify features with minimal predictive power, potentially removing them to simplify the model.
C
Implement the AI Explanations feature on AI Platform by sending your prediction requests to the model's explain endpoint. This approach uses the sampled Shapley method to return per-feature attributions for each individual prediction (see the request sketch after the options).
D
Export the prediction outputs to BigQuery and employ the CORR(X1, X2) function to calculate the Pearson correlation coefficient between each feature and the target variable, aiming to identify the most influential attributes.
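For context on option C, here is a minimal sketch of what an explain request might look like from Python, assuming a TensorFlow model version that was already deployed on AI Platform with sampled Shapley explanations enabled. The project, model, version, and feature names below are placeholders, not values from the question.

```python
from googleapiclient import discovery

# Placeholder identifiers; the deployed version is assumed to have been
# created with explanations (sampled Shapley) enabled.
PROJECT = "my-project"
MODEL = "subscription_renewal"
VERSION = "v1"

# Build a client for the AI Platform Training & Prediction API.
service = discovery.build("ml", "v1")
name = f"projects/{PROJECT}/models/{MODEL}/versions/{VERSION}"

# Instances must match the model's serving input signature;
# the attribute names here are illustrative only.
instances = [{"tenure_months": 18, "issues_read": 42, "plan_type": "annual"}]

# Call the explain endpoint instead of predict to get attributions
# alongside the prediction.
response = (
    service.projects()
    .explain(name=name, body={"instances": instances})
    .execute()
)

# Each returned explanation carries per-feature attribution scores
# computed with the sampled Shapley method.
for explanation in response.get("explanations", []):
    print(explanation)
```

The attribution values indicate how much each input attribute moved the prediction away from the model's baseline, which is what allows you to report the most influential attribute for each customer to stakeholders.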