You are an ML engineer at a bank building a solution that provides transparent explanations for AI-driven loan decisions (approvals, credit limits, interest rates) with minimal operational overhead. What is your recommended approach?
A. Deploy the Learning Interpretability Tool (LIT) on App Engine to provide explainability and visualization of the output.
B. Use Vertex Explainable AI to generate feature attributions, and use feature-based explanations for your models.
C. Use AutoML Tables with built-in explainability features, and use Shapley values for explainability.
D. Deploy pre-trained models from TensorFlow Hub to provide explainability using visualization tools.
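For context on option B, below is a minimal sketch of how feature attributions could be configured with the Vertex AI Python SDK. The project ID, bucket path, model and feature names, container image, and attribution settings are hypothetical placeholders, not part of the question; the exact configuration depends on your model and serving container.

```python
# Sketch only: enabling Vertex Explainable AI feature attributions on an
# uploaded model. All resource names below are hypothetical placeholders.
from google.cloud import aiplatform
from google.cloud.aiplatform.explain import ExplanationMetadata, ExplanationParameters

aiplatform.init(project="my-bank-project", location="us-central1")  # hypothetical project

# Describe the model's inputs and outputs so Vertex Explainable AI
# knows which tensors to attribute.
metadata = ExplanationMetadata(
    inputs={"loan_features": ExplanationMetadata.InputMetadata()},
    outputs={"approval_score": ExplanationMetadata.OutputMetadata()},
)

# Sampled Shapley is one of the supported attribution methods.
parameters = ExplanationParameters({"sampled_shapley_attribution": {"path_count": 10}})

model = aiplatform.Model.upload(
    display_name="loan-decision-model",
    artifact_uri="gs://my-bucket/loan-model/",  # hypothetical artifact location
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
    explanation_metadata=metadata,
    explanation_parameters=parameters,
)

endpoint = model.deploy(machine_type="n1-standard-4")

# At serving time, explain() returns per-feature attributions with predictions.
response = endpoint.explain(instances=[[0.42, 700, 120000, 0.3]])  # hypothetical features
for explanation in response.explanations:
    for attribution in explanation.attributions:
        print(attribution.feature_attributions)
```

The attribution method is a design choice: sampled Shapley works with most model types, while integrated gradients and XRAI are limited to differentiable models.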