
Answer-first summary for fast verification
Answer: Export the Dataprep job as a Dataflow template, and incorporate it into a Cloud Composer job.
Exporting the Dataprep job as a Dataflow template lets the recipe run as a Dataflow pipeline, which Cloud Composer can then schedule and execute. Cloud Composer is a fully managed workflow orchestration service (built on Apache Airflow) that provides scheduling, monitoring, and retry capabilities for data pipelines. The key constraint here is that the upstream load job completes at a variable time: a fixed cron schedule (options A and B) risks running against stale or incomplete data, whereas a Composer DAG can trigger the template only after the load task has actually succeeded. While option D (exporting the recipe as a Dataprep template and scheduling it with Cloud Scheduler) is workable, running the exported template through Dataflow under Cloud Composer provides superior pipeline management and monitoring.
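As a sketch of the launch step: in a Composer DAG the exported template is typically started with the Google provider's `DataflowTemplatedJobStartOperator`, placed downstream of the BigQuery load task. Under the hood this amounts to a call to the Dataflow `projects.locations.templates.launch` API. The helper below builds a request body of the shape that API expects; the job name and parameter names are illustrative placeholders, not values from the question:

```python
def build_template_launch_body(job_name, template_params, zone=None):
    """Build a request body for the Dataflow templates.launch API call.

    job_name: name to assign to the launched Dataflow job.
    template_params: dict of runtime parameters baked into the exported template.
    zone: optional worker zone, passed via the 'environment' block.
    """
    body = {"jobName": job_name, "parameters": template_params}
    if zone:
        body["environment"] = {"zone": zone}
    return body


# Hypothetical usage: launch the exported Dataprep recipe against today's data.
launch_body = build_template_launch_body(
    "dataprep-recipe-daily",
    {"inputTable": "my-project:my_dataset.daily_load"},
)
```

In a real DAG, the operator handles the API call itself; the point is that the launch is a task node, so Airflow's dependency graph guarantees it runs only after the variable-duration load task completes.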
Author: LeetQuiz Editorial Team
You have a Dataprep recipe initially created from a sample of data in a BigQuery table. What is the best method to apply this recipe daily to new data with the same schema, following a variable load job's completion?
A
Schedule a Dataprep job using a cron schedule
B
Use an App Engine cron job to schedule the execution of the Dataprep job
C
Export the Dataprep job as a Dataflow template, and incorporate it into a Cloud Composer job.
D
Export the recipe as a Dataprep template, and schedule a job using Cloud Scheduler