
Answer-first summary for fast verification
Answer: Convert the Dataprep job into a Dataflow template and integrate it into a Cloud Composer workflow.
Converting the Dataprep job into a Dataflow template and incorporating it into a Cloud Composer workflow is the most robust solution. Because the upstream load job takes a variable amount of time to complete, a fixed cron schedule (options A, B, and D) risks running the transformation before the new data has landed. Cloud Composer, a fully managed workflow orchestration service built on Apache Airflow, can trigger the Dataflow template only after the load job finishes, and it provides scheduling, monitoring, and retry capabilities that simple cron-based or App Engine cron approaches lack. While saving the recipe and triggering it with Cloud Scheduler is feasible, Dataflow combined with Cloud Composer offers stronger dependency handling, pipeline management, and monitoring.
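As a concrete illustration, exporting a Dataprep job produces a classic Dataflow template that can be launched via the Dataflow `projects.templates.launch` REST API. The sketch below builds a launch request body; the job name, parameter names, and bucket paths are assumptions, not values from the question.

```python
# Minimal sketch of the request body for launching a classic Dataflow
# template (the artifact produced by exporting a Dataprep job).
# Job name, parameter keys, and the GCS paths are hypothetical.

def build_template_launch_body(job_name: str, parameters: dict,
                               temp_location: str) -> dict:
    """Build the body for a Dataflow projects.templates.launch API call."""
    return {
        "jobName": job_name,
        # Runtime parameters for the template, e.g. input/output tables.
        "parameters": parameters,
        # Dataflow needs a GCS staging location for temporary files.
        "environment": {"tempLocation": temp_location},
    }

body = build_template_launch_body(
    job_name="daily-dataprep-transform",
    parameters={"inputTable": "my_project:my_dataset.daily_load"},
    temp_location="gs://my-bucket/tmp",
)
```

In a Cloud Composer DAG, this launch step is typically expressed with the `DataflowTemplatedJobStartOperator` from the Airflow Google provider package, placed downstream of the task (or sensor) that confirms the variable-duration load job has completed.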
Author: LeetQuiz Editorial Team
You have a Dataprep recipe initially designed with a sample dataset from a BigQuery table. What is the best method to apply this recipe daily to new data sharing the same schema, after a load job of variable duration completes?
A
Schedule the Dataprep job directly using a cron schedule.
B
Utilize an App Engine cron job to automate the Dataprep job execution.
C
Convert the Dataprep job into a Dataflow template and integrate it into a Cloud Composer workflow.
D
Save the recipe as a Dataprep template and use Cloud Scheduler to trigger the job.