You are tasked with implementing a batch-inference pipeline for a machine learning model using Google Cloud services. The model, developed with TensorFlow, is stored in the SavedModel format in Cloud Storage. Your objective is to apply this model to a historical dataset of 10 TB stored in a BigQuery table. Considering the large size of the dataset and the need for efficient processing, how should you perform the inference?
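One commonly cited way to handle this scenario is a Vertex AI batch prediction job that reads directly from BigQuery and writes predictions back to BigQuery, so no manual export of the 10 TB table is needed. The sketch below is illustrative only, assuming the google-cloud-aiplatform Python SDK; the project, bucket, dataset, table names, replica counts, and serving container image are placeholders, not values from the question.

```python
# Minimal sketch: register a SavedModel and run a BigQuery-to-BigQuery
# batch prediction job on Vertex AI. All resource names below are
# hypothetical placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Import the SavedModel from Cloud Storage into the Vertex AI Model Registry,
# using a prebuilt TensorFlow serving container (image tag is an assumption;
# pick one matching your TensorFlow version).
model = aiplatform.Model.upload(
    display_name="historical-tf-model",
    artifact_uri="gs://my-bucket/saved_model_dir",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-12:latest"
    ),
)

# Launch a batch prediction job that reads the 10 TB table from BigQuery
# and writes predictions to a BigQuery destination, scaling out workers
# to process the data in parallel.
batch_job = model.batch_predict(
    job_display_name="historical-batch-inference",
    bigquery_source="bq://my-project.my_dataset.historical_table",
    bigquery_destination_prefix="bq://my-project.my_dataset_predictions",
    machine_type="n1-standard-4",
    starting_replica_count=10,
    max_replica_count=50,
    sync=False,
)
```

The appeal of this pattern for a dataset of this size is that Vertex AI shards the BigQuery input and fans it out across replicas, so throughput scales with `max_replica_count` rather than with a single serving endpoint; an alternative with similar properties would be a Dataflow pipeline using Apache Beam's RunInference transform over a BigQuery read.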