
You are tasked with implementing a batch inference machine learning pipeline on Google Cloud. The model, developed in TensorFlow, is stored in SavedModel format in Cloud Storage. Your objective is to apply this model to a historical dataset of 10 TB stored in a BigQuery table. Given the size of the dataset and the need for efficient processing, how should you perform the inference?
A. Export the historical data to Cloud Storage in Avro format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.
B. Import the TensorFlow model by using the CREATE MODEL statement in BigQuery ML. Apply the model to the historical data.
C. Export the historical data to Cloud Storage in CSV format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.
D. Configure a Vertex AI batch prediction job to apply the model directly to the historical data in BigQuery.
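For context on option B, BigQuery ML can import a TensorFlow SavedModel from Cloud Storage so that inference runs inside BigQuery itself. Below is a minimal sketch using the Python BigQuery client; the project ID, dataset, bucket, and table names are hypothetical placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Import the SavedModel from Cloud Storage into BigQuery ML.
# MODEL_PATH must point at the SavedModel directory (note the wildcard).
create_model_sql = """
CREATE OR REPLACE MODEL `my-project.ml_models.tf_model`
OPTIONS (MODEL_TYPE = 'TENSORFLOW',
         MODEL_PATH = 'gs://my-bucket/saved_model/*')
"""
client.query(create_model_sql).result()

# Apply the imported model to the historical table with ML.PREDICT.
predict_sql = """
SELECT *
FROM ML.PREDICT(MODEL `my-project.ml_models.tf_model`,
                TABLE `my-project.warehouse.historical_data`)
"""
for row in client.query(predict_sql).result():
    ...  # consume or materialize predictions
```

Note that BigQuery ML imposes limits on imported TensorFlow models (for example, on model size), so this path is not viable for every SavedModel.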
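Option D relies on Vertex AI batch prediction's native BigQuery input and output support, which avoids exporting 10 TB to Cloud Storage first. Below is a minimal sketch using the Vertex AI Python SDK; the project, region, URIs, serving image, machine type, and replica counts are assumptions for illustration:

```python
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")  # hypothetical

# Register the SavedModel from Cloud Storage with a prebuilt
# TensorFlow serving container (pin the image to your TF version).
model = aiplatform.Model.upload(
    display_name="tf-batch-model",
    artifact_uri="gs://my-bucket/saved_model",  # directory with saved_model.pb
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-12:latest"
    ),
)

# Run batch prediction directly against the BigQuery table; predictions
# land in a BigQuery dataset, so the 10 TB never has to be staged in
# Cloud Storage. The call blocks until the job finishes (sync=True default).
batch_job = model.batch_predict(
    job_display_name="historical-batch-inference",
    bigquery_source="bq://my-project.warehouse.historical_data",
    bigquery_destination_prefix="bq://my-project.predictions",
    instances_format="bigquery",
    predictions_format="bigquery",
    machine_type="n1-standard-8",
    starting_replica_count=10,
    max_replica_count=40,
)
```

The replica counts here are only an example of how the job scales out over a large table; appropriate values depend on model size and throughput requirements.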