
As a professional working for a public transportation company, you are tasked with building a predictive model to estimate delay times for various routes. These predictions will be delivered to users in real time via an app, and the model requires monthly retraining to account for seasonal and population changes. The solution must be cost-effective and scalable, and it must comply with data privacy regulations. According to Google's recommended best practices, which of the following options provides the BEST end-to-end architecture for this model? Choose the best option. (Illustrative sketches of each option appear after the answer choices.)
A
Use Cloud Composer to orchestrate a Dataflow job that processes data, trains the model on AI Platform, and deploys it for real-time predictions, ensuring scalability and cost-efficiency.
B
Configure Kubeflow Pipelines to manage the multi-step workflow from data preprocessing, model training on AI Platform, to deployment, facilitating easy retraining and adherence to best practices.
C
Develop a Cloud Functions script that triggers a training and deployment pipeline on AI Platform via Cloud Scheduler, suitable for lightweight and event-driven retraining scenarios.
D
Implement a solution using BigQuery ML for training and deploying the model, leveraging its scheduled queries feature for monthly retraining, ideal for teams with strong SQL expertise.
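
For concreteness, the sketches below show roughly what each option could look like in practice. All project IDs, bucket paths, table names, operator arguments, and runtime versions are illustrative assumptions, not details given in the question.

Option A could be expressed as a Cloud Composer (Airflow) DAG that runs a Dataflow template and then submits an AI Platform training job on a monthly schedule; this is a minimal sketch, not a complete production DAG:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataflow import (
    DataflowTemplatedJobStartOperator,
)
from airflow.providers.google.cloud.operators.mlengine import (
    MLEngineStartTrainingJobOperator,
)

with DAG(
    dag_id="delay_model_monthly_retraining",
    schedule_interval="@monthly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Step 1: preprocess raw delay data with a Dataflow template (assumed path).
    preprocess = DataflowTemplatedJobStartOperator(
        task_id="preprocess_delay_data",
        job_name="preprocess-delays",
        template="gs://my-transit-bucket/templates/preprocess_delays",
        location="us-central1",
    )

    # Step 2: submit an AI Platform training job on the preprocessed data.
    train = MLEngineStartTrainingJobOperator(
        task_id="train_delay_model",
        project_id="my-transit-project",
        job_id="delay_model_{{ ds_nodash }}",
        region="us-central1",
        package_uris=["gs://my-transit-bucket/trainer/trainer-0.1.tar.gz"],
        training_python_module="trainer.task",
        training_args=["--data-dir=gs://my-transit-bucket/processed/"],
        scale_tier="BASIC",
    )

    preprocess >> train
```

A deployment step (for example, creating a new model version for online prediction) would normally follow training; it is omitted here for brevity.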
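Option B could be wired up with the Kubeflow Pipelines SDK (KFP v2). The component bodies below are placeholders; in a real pipeline they would pull the raw data, engineer features, train on AI Platform, and deploy the model:

```python
from kfp import compiler, dsl
from kfp.dsl import Dataset, Input, Model, Output


@dsl.component(base_image="python:3.10")
def preprocess(source_table: str, features: Output[Dataset]):
    # Placeholder: read raw delay data and write engineered features.
    with open(features.path, "w") as f:
        f.write(f"features derived from {source_table}")


@dsl.component(base_image="python:3.10")
def train(features: Input[Dataset], model: Output[Model]):
    # Placeholder: fit the delay-prediction model and export the artifact.
    with open(model.path, "w") as f:
        f.write("serialized model")


@dsl.pipeline(name="delay-model-monthly-retraining")
def retraining_pipeline(source_table: str = "transit.delays_raw"):
    prep = preprocess(source_table=source_table)
    train(features=prep.outputs["features"])


if __name__ == "__main__":
    # Compile to a pipeline spec that a scheduler can submit each month.
    compiler.Compiler().compile(retraining_pipeline, "retraining_pipeline.json")
```

The compiled spec shows only the pipeline wiring; the monthly trigger and the final deployment step for real-time serving would be added around it.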
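Option C could be an HTTP-triggered Cloud Function that a monthly Cloud Scheduler job invokes to submit an AI Platform training job through the API client; the project ID, bucket, and runtime versions are assumed values:

```python
from datetime import datetime

from googleapiclient import discovery


def trigger_retraining(request):
    """HTTP-triggered Cloud Function, invoked monthly by a Cloud Scheduler job."""
    project_id = "my-transit-project"  # assumed project ID
    job_id = "delay_model_" + datetime.utcnow().strftime("%Y%m%d_%H%M%S")

    training_job = {
        "jobId": job_id,
        "trainingInput": {
            "region": "us-central1",
            "scaleTier": "BASIC",
            "packageUris": ["gs://my-transit-bucket/trainer/trainer-0.1.tar.gz"],
            "pythonModule": "trainer.task",
            "runtimeVersion": "2.11",  # assumed AI Platform runtime
            "pythonVersion": "3.7",
        },
    }

    # Submit the training job to the AI Platform Training service.
    ml = discovery.build("ml", "v1", cache_discovery=False)
    ml.projects().jobs().create(
        parent=f"projects/{project_id}", body=training_job
    ).execute()

    return f"Submitted retraining job {job_id}", 200
```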
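Option D boils down to a BigQuery ML training statement that a scheduled query would rerun each month; the dataset, table, and feature columns below are assumptions for illustration, shown here submitted through the Python client:

```python
from google.cloud import bigquery

client = bigquery.Client()

# The CREATE MODEL statement a BigQuery scheduled query would run monthly.
# Dataset, table, and column names are assumptions for illustration.
retrain_sql = """
CREATE OR REPLACE MODEL `transit.delay_model`
OPTIONS (
  model_type = 'linear_reg',
  input_label_cols = ['delay_minutes']
) AS
SELECT
  route_id,
  hour_of_day,
  day_of_week,
  weather_code,
  delay_minutes
FROM `transit.historical_delays`
WHERE trip_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 12 MONTH)
"""

client.query(retrain_sql).result()  # blocks until the model finishes training
```

Predictions would then be served with ML.PREDICT queries, which is why this option suits SQL-centric teams but is less natural for low-latency, per-request predictions in an app.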