
Answer-first summary for fast verification
Answer: Export the model to BigQuery ML.
Option A, 'Export the model to BigQuery ML,' is correct. BigQuery ML can import a trained TensorFlow SavedModel and run predictions with SQL directly inside BigQuery, where the text data already lives. This minimizes computational overhead: predictions execute on BigQuery's own infrastructure, so there is no need to move data between services or provision separate compute. Deploying and versioning the model on AI Platform (B, D) or running a Dataflow pipeline with the SavedModel (C) introduces extra serving or pipeline infrastructure and data movement that this scenario does not require.
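As a sketch of this approach (the Cloud Storage path, dataset, table, and column names below are hypothetical), the TensorFlow SavedModel is registered in BigQuery ML with `CREATE MODEL` and then queried in place with `ML.PREDICT`:

```sql
-- Import the TensorFlow SavedModel from Cloud Storage (hypothetical paths/names).
CREATE OR REPLACE MODEL my_dataset.text_classifier
  OPTIONS (model_type = 'TENSORFLOW',
           model_path = 'gs://my-bucket/saved_model/*');

-- Run batch predictions directly over the BigQuery table holding the text data.
SELECT *
FROM ML.PREDICT(MODEL my_dataset.text_classifier,
                (SELECT text AS input FROM my_dataset.documents));
```

Note that the column aliases in the inner `SELECT` must match the input names of the SavedModel's serving signature.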
Author: LeetQuiz Editorial Team
After training a text classification model in TensorFlow using AI Platform, you need to perform batch predictions on a large dataset of text data that is stored in BigQuery. Your goal is to minimize computational overhead during this process. What is the most efficient approach?
A
Export the model to BigQuery ML.
B
Deploy and version the model on AI Platform.
C
Use Dataflow with the SavedModel to read the data from BigQuery.
D
Submit a batch prediction job on AI Platform that points to the model location in Cloud Storage.