
Answer-first summary for fast verification
Answer: Ingest the Avro files into BigQuery to perform analytics. Use a Dataflow pipeline to create the features, and store them in Vertex AI Feature Store for online prediction.
The correct answer is B. BigQuery natively supports loading Avro files and its SQL engine is well suited for large-scale analytics. Dataflow can then transform the ingested data into features, and Vertex AI Feature Store is purpose-built for managing and serving those features at the low latency online prediction requires. The Cloud Spanner options (A and C) misuse an OLTP database for analytics, and serving features from a plain BigQuery table (C and D) cannot meet online-prediction latency needs, so B is the efficient, scalable workflow.
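As a rough sketch of the ingestion and analytics steps, the bq CLI can load Avro directly from Cloud Storage and query the result. The dataset, table, and bucket names below are placeholders, not values from the question; the downstream Dataflow and Feature Store steps are only outlined in comments because their configuration is project-specific.

```shell
# Sketch only: project, bucket, dataset, and table names are hypothetical.

# 1) Ingest: load the exported Avro files from the Cloud Storage bucket
#    into BigQuery. Avro is self-describing, so the schema is inferred
#    from the files themselves.
bq load --source_format=AVRO \
  my_dataset.raw_events \
  'gs://my-export-bucket/exports/*.avro'

# 2) Analytics: run standard SQL over the ingested table.
bq query --use_legacy_sql=false \
  'SELECT user_id, COUNT(*) AS event_count
   FROM my_dataset.raw_events
   GROUP BY user_id'

# 3) Features: a Dataflow (Apache Beam) pipeline would read from the
#    BigQuery table, compute feature values, and write them to
#    Vertex AI Feature Store, which then serves them for online prediction.
```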
Author: LeetQuiz Editorial Team
As a machine learning engineer at a large organization, you are tasked with migrating the company's ML and data workloads to Google Cloud. The data engineering team has provided you with structured data exported to a Cloud Storage bucket in Avro format. Your project requires setting up a workflow that will perform data analytics, create features for the ML model, and host these features for online predictions. How should you configure the pipeline?
A
Ingest the Avro files into Cloud Spanner to perform analytics. Use a Dataflow pipeline to create the features, and store them in Vertex AI Feature Store for online prediction.
B
Ingest the Avro files into BigQuery to perform analytics. Use a Dataflow pipeline to create the features, and store them in Vertex AI Feature Store for online prediction.
C
Ingest the Avro files into Cloud Spanner to perform analytics. Use a Dataflow pipeline to create the features, and store them in BigQuery for online prediction.
D
Ingest the Avro files into BigQuery to perform analytics. Use BigQuery SQL to create features and store them in a separate BigQuery table for online prediction.