
Answer-first summary for fast verification
Answer: Schedule a weekly query in BigQuery to compute the success metric.
Option C is the best answer. Scheduling a weekly query in BigQuery is the most cost-effective way to monitor the model, because all the data needed to compute the success metric is already stored in BigQuery and refreshed hourly. Since performance typically stays above the baseline for about five weeks, a weekly check is frequent enough to detect degradation and trigger the roughly 12-hour retraining job before the baseline is breached. Option A adds Vertex AI Model Monitoring costs and measures input-feature skew rather than the business success metric itself; option B retrains every week regardless of need, wasting compute; and option D introduces Dataflow and Cloud Composer infrastructure for a daily computation that a simple scheduled query already covers.
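To make the success metric concrete, here is a minimal Python sketch of the per-recommendation rule the question defines (opened within two days of publication, at least one minute on the page). The function names, row shape, and thresholds are illustrative assumptions; in practice this aggregation would be expressed as SQL in a BigQuery scheduled query over the event tables.

```python
from datetime import datetime, timedelta

# Assumed constants taken from the question's definition of success.
SUCCESS_WINDOW = timedelta(days=2)      # article opened within two days
MIN_TIME_ON_PAGE_SECONDS = 60           # reader spends at least one minute

def is_success(published_at, opened_at, seconds_on_page):
    """Return True if a single recommendation counts as a success."""
    if opened_at is None:               # newsletter link never opened
        return False
    opened_in_window = timedelta(0) <= opened_at - published_at <= SUCCESS_WINDOW
    return opened_in_window and seconds_on_page >= MIN_TIME_ON_PAGE_SECONDS

def success_rate(rows):
    """Fraction of successful recommendations.

    `rows` is a hypothetical iterable of
    (published_at, opened_at, seconds_on_page) tuples.
    """
    rows = list(rows)
    if not rows:
        return 0.0
    hits = sum(is_success(p, o, s) for p, o, s in rows)
    return hits / len(rows)
```

A weekly scheduled query would compute this rate over the latest newsletter's events and compare it against the acceptable baseline; retraining is triggered only when the rate falls below it.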
Author: LeetQuiz Editorial Team
You work for an online publisher that delivers news articles to over 50 million readers. As part of your role, you have built an AI model to recommend content for the company’s weekly newsletter. The success of a recommendation is determined if the recommended article is opened within two days of the newsletter’s publication date and the user spends at least one minute on the page. The data required to calculate this success metric is stored in BigQuery and is updated on an hourly basis. The AI model, trained on eight weeks of data, typically sees its performance drop below an acceptable baseline after five weeks, and retraining the model takes approximately 12 hours. Your goal is to ensure the model maintains performance above the acceptable baseline while also minimizing costs. How should you monitor the model to determine when it needs to be retrained?
A
Use Vertex AI Model Monitoring to detect skew of the input features with a sample rate of 100% and a monitoring frequency of two days.
B
Schedule a cron job in Cloud Tasks to retrain the model every week before the newsletter is created.
C
Schedule a weekly query in BigQuery to compute the success metric.
D
Schedule a daily Dataflow job in Cloud Composer to compute the success metric.