
Answer-first summary for fast verification
Answer: Create a Vertex AI Workbench user-managed notebook using the default VM instance, and use the %%bigquery magic commands in Jupyter to query the tables.
The correct answer is A. A Vertex AI Workbench user-managed notebook on the default VM instance minimizes both cost and development effort: the default machine type is inexpensive, and the %%bigquery magic commands let you query BigQuery tables directly from Jupyter cells with no extra setup. Managed notebooks (options B and D) carry higher per-hour pricing, and provisioning a Dataproc Hub or cluster (options C and D) adds infrastructure cost and operational overhead that basic EDA, preprocessing, and model-training experiments do not require.
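As a sketch of the workflow option A describes: the %%bigquery cell magic (provided by the google-cloud-bigquery library, which is preinstalled on Workbench images) runs a SQL query and returns the result as a pandas DataFrame. Each magic must be the first line of its own notebook cell; the project, dataset, and table names below are placeholders, not part of the question.

```python
# --- Cell 1: load the BigQuery cell magic (one-time per kernel) ---
%load_ext google.cloud.bigquery

# --- Cell 2: query BigQuery and bind the result to a DataFrame ---
# `transactions` becomes a pandas DataFrame in the notebook namespace.
%%bigquery transactions
SELECT
  customer_id,
  product_category,
  SUM(order_total) AS total_spent
FROM `my-project.retail.transactions`
GROUP BY customer_id, product_category
LIMIT 1000
```

After the second cell runs, `transactions` is an ordinary pandas DataFrame, so subsequent preprocessing and model-training cells can operate on it directly with no connector or cluster configuration.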
Author: LeetQuiz Editorial Team
You are developing a recommendation engine for an online clothing store using historical customer transaction data stored in BigQuery and Cloud Storage. Your tasks include performing exploratory data analysis (EDA), preprocessing the data, and training machine learning models, and you will rerun these steps multiple times as you experiment with different algorithms to find the best-performing model. Given that you aim to minimize both cost and development effort during these experiments, how should you configure the environment?
A
Create a Vertex AI Workbench user-managed notebook using the default VM instance, and use the %%bigquery magic commands in Jupyter to query the tables.
B
Create a Vertex AI Workbench managed notebook to browse and query the tables directly from the JupyterLab interface.
C
Create a Vertex AI Workbench user-managed notebook on a Dataproc Hub, and use the %%bigquery magic commands in Jupyter to query the tables.
D
Create a Vertex AI Workbench managed notebook on a Dataproc cluster, and use the spark-bigquery-connector to access the tables.