
You recently used TensorFlow to train a classification model on tabular data, taking advantage of its ability to handle large datasets efficiently. You have also built a Dataflow pipeline that can transform several terabytes of raw data into training or prediction datasets of TFRecords. To integrate the model into your regular workflow, you need to productionize it so that its predictions are automatically written to a BigQuery table on a weekly schedule. What should you do?
A. Import the model into Vertex AI and deploy it to a Vertex AI endpoint. On Vertex AI Pipelines, create a pipeline that uses the DataflowPythonJobOp and ModelBatchPredictOp components.
B. Import the model into Vertex AI and deploy it to a Vertex AI endpoint. Create a Dataflow pipeline that reuses the data processing logic, sends requests to the endpoint, and then uploads the predictions to a BigQuery table.
C. Import the model into Vertex AI. On Vertex AI Pipelines, create a pipeline that uses the DataflowPythonJobOp and ModelBatchPredictOp components.
D. Import the model into BigQuery. Implement the data processing logic in a SQL query. On Vertex AI Pipelines, create a pipeline that uses the BigqueryQueryJobOp and BigqueryPredictModelJobOp components.
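
For context, here is a minimal sketch of what the pipeline described in option C could look like, assuming KFP v2 and the google_cloud_pipeline_components v1 components. The project, region, model resource name, and URIs are placeholders, and the exact parameter set may vary between library versions.

```python
# Hypothetical sketch only: batch prediction from TFRecords into BigQuery,
# without deploying the model to an online endpoint.
from kfp import dsl
from google_cloud_pipeline_components.types import artifact_types
from google_cloud_pipeline_components.v1.batch_predict_job import ModelBatchPredictOp
from google_cloud_pipeline_components.v1.dataflow import DataflowPythonJobOp
from google_cloud_pipeline_components.v1.wait_gcp_resources import WaitGcpResourcesOp

PROJECT = "my-project"        # placeholder
REGION = "us-central1"        # placeholder
MODEL_NAME = "projects/my-project/locations/us-central1/models/1234567890"  # placeholder


@dsl.pipeline(name="weekly-tfrecord-batch-prediction")
def weekly_batch_prediction(
    preprocess_script: str,    # GCS path to the existing Dataflow (Beam) Python script
    temp_location: str,        # GCS temp location for the Dataflow job
    tfrecord_uri_prefix: str,  # GCS prefix where the Dataflow job writes TFRecords
    bq_destination: str,       # e.g. bq://my-project.predictions
):
    # Reuse the existing Dataflow job to transform raw data into TFRecords.
    preprocess = DataflowPythonJobOp(
        project=PROJECT,
        location=REGION,
        python_module_path=preprocess_script,
        temp_location=temp_location,
    )

    # The Dataflow component launches the job asynchronously; wait until it finishes.
    wait = WaitGcpResourcesOp(gcp_resources=preprocess.outputs["gcp_resources"])

    # Reference the model that was imported into the Vertex AI Model Registry.
    model = dsl.importer(
        artifact_uri=f"https://{REGION}-aiplatform.googleapis.com/v1/{MODEL_NAME}",
        artifact_class=artifact_types.VertexModel,
        metadata={"resourceName": MODEL_NAME},
    )

    # Run batch prediction over the TFRecords and write predictions to BigQuery.
    ModelBatchPredictOp(
        project=PROJECT,
        location=REGION,
        job_display_name="weekly-batch-prediction",
        model=model.output,
        gcs_source_uris=[tfrecord_uri_prefix],
        instances_format="tf-record",
        predictions_format="bigquery",
        bigquery_destination_output_uri=bq_destination,
    ).after(wait)
```

The compiled pipeline could then be run on a weekly cadence, for example with a Vertex AI pipeline run schedule or a Cloud Scheduler trigger.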