
Answer-first summary for fast verification
Answer: Use the Kubeflow Pipelines (KFP) SDK to create multiple components that use Dataflow and Vertex AI services. Deploy the workflow on Vertex AI Pipelines.
The correct answer is C. Kubeflow Pipelines (KFP) is a popular open-source framework designed specifically for building and deploying machine learning workflows. Its SDK lets you define each step of the workflow as a reusable component and handles orchestration between them. Vertex AI Pipelines is a managed Google Cloud service that runs KFP pipelines serverlessly, adding scheduling, monitoring, and versioning with no infrastructure for you to operate. This makes it ideal for the low-maintenance, automated workflow the question requires: individual components can launch Dataflow jobs for statistics collection and data transformation, and Vertex AI jobs for training and evaluation. By contrast, the alternatives that deploy on Cloud Composer or a Google Kubernetes Engine cluster require you to provision and maintain that underlying environment, which conflicts with the low-maintenance requirement.
Author: LeetQuiz Editorial Team
You are a machine learning engineer tasked with building a TensorFlow text-to-image generative model. Your dataset contains billions of images along with their respective captions. You aim to design a low-maintenance, automated workflow, which must: (1) Read data from a Cloud Storage bucket, (2) Collect statistics, (3) Split the dataset into training, validation, and test datasets, (4) Perform necessary data transformations, (5) Train the model using the training and validation datasets, and (6) Validate the model using the test dataset. Given these requirements, which approach should you choose?
A
Use the Apache Airflow SDK to create multiple operators that use Dataflow and Vertex AI services. Deploy the workflow on Cloud Composer.
B
Use the MLFlow SDK and deploy it on a Google Kubernetes Engine cluster. Create multiple components that use Dataflow and Vertex AI services.
C
Use the Kubeflow Pipelines (KFP) SDK to create multiple components that use Dataflow and Vertex AI services. Deploy the workflow on Vertex AI Pipelines.
D
Use the TensorFlow Extended (TFX) SDK to create multiple components that use Dataflow and Vertex AI services. Deploy the workflow on Vertex AI Pipelines.