
Answer-first summary for fast verification
Answer: Export the Dataprep job as a Dataflow template, and incorporate it into a Composer job.
The correct answer is D. The load job has a variable execution time, so a fixed schedule cannot guarantee the new data is in place; the Dataprep job must be triggered by the load job's completion, not by the clock. Option D exports the Dataprep job as a Dataflow template (Dataprep jobs run on Dataflow under the hood) and incorporates it into a Cloud Composer (Apache Airflow) job, where an upstream task or sensor waits for the load to finish before launching the template. Options A and B rely on cron-style schedules that fire at fixed times regardless of whether the load has completed, and option C describes an export format that does not exist: Dataprep recipes cannot be exported as "Dataprep templates" for Cloud Scheduler to run.
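The orchestration logic behind option D can be sketched in plain Python: poll until the upstream load reports completion, then launch the downstream job. In a real Composer DAG this pattern is expressed declaratively (for example, a sensor task upstream of a Dataflow template operator from the Google provider package); the names `fake_load_done` and `fake_start_dataflow_template` below are hypothetical stand-ins used only to make the control flow visible.

```python
import time


def wait_then_run(is_load_done, start_job, poll_interval=0.01, timeout=5.0):
    """Poll until the upstream load finishes, then trigger the downstream job.

    This mirrors what a Composer DAG gives you for free: the downstream
    task (the Dataflow-templated Dataprep job) starts only after the
    upstream dependency (the variable-duration load job) has completed.
    """
    deadline = time.monotonic() + timeout
    while not is_load_done():
        if time.monotonic() >= deadline:
            raise TimeoutError("load job did not finish in time")
        time.sleep(poll_interval)
    return start_job()


# --- hypothetical stand-ins for the real load job and Dataflow template ---
events = []
state = {"checks": 0}


def fake_load_done():
    # The "load" finishes on the third poll, simulating variable duration.
    state["checks"] += 1
    return state["checks"] >= 3


def fake_start_dataflow_template():
    events.append("dataflow_started")
    return "job-123"


job_id = wait_then_run(fake_load_done, fake_start_dataflow_template)
print(job_id, events)
```

The key property, which cron-based options A and B lack, is that `start_job` is invoked only after `is_load_done` returns true, however long that takes (up to the timeout).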
Author: LeetQuiz Editorial Team
You have utilized Dataprep to design a recipe for a sample dataset stored in a BigQuery table. Your objective is to implement this recipe daily on newly uploaded data that maintains the same schema. However, you need this to occur after the data load job, which has a variable execution time, finishes. What actions should you take to achieve this?
A
Create a cron schedule in Dataprep.
B
Create an App Engine cron job to schedule the execution of the Dataprep job.
C
Export the recipe as a Dataprep template, and create a job in Cloud Scheduler.
D
Export the Dataprep job as a Dataflow template, and incorporate it into a Composer job.