You have created three distinct data processing jobs as part of your data pipeline management tasks.
You need a solution that lets you schedule and monitor the execution of these three workflows, and that also lets you trigger them manually when necessary. What approach should you take?
A. Create a Directed Acyclic Graph (DAG) in Cloud Composer to schedule and monitor the jobs.
B. Use Stackdriver Monitoring and set up an alert with a webhook notification to trigger the jobs.
C. Develop an App Engine application to schedule and request the status of the jobs using GCP API calls.
D. Set up cron jobs on a Compute Engine instance to schedule and monitor the pipelines using GCP API calls.
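For context, the Cloud Composer approach in option A would be expressed as an Airflow DAG. The sketch below is a minimal, assumed example: the DAG name, task IDs, schedule, and BashOperator commands are placeholders standing in for the three unspecified jobs, which in practice would use the appropriate GCP operators (e.g., for BigQuery, Dataproc, or Dataflow).

```python
# Minimal illustrative Airflow DAG for Cloud Composer (assumed names and commands).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-eng",           # placeholder owner
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="three_job_pipeline",   # hypothetical DAG name
    default_args=default_args,
    schedule_interval="@daily",    # scheduled runs; manual triggers remain available
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Placeholder tasks standing in for the three data processing jobs.
    job_1 = BashOperator(task_id="job_1", bash_command="echo 'run job 1'")
    job_2 = BashOperator(task_id="job_2", bash_command="echo 'run job 2'")
    job_3 = BashOperator(task_id="job_3", bash_command="echo 'run job 3'")

    # Run the jobs in sequence; the Airflow UI in Composer provides
    # monitoring and on-demand (manual) triggering of the DAG.
    job_1 >> job_2 >> job_3
```

Because Composer exposes scheduling, dependency management, monitoring, and manual triggering through the Airflow UI and CLI, it covers all three requirements in the question stem without custom App Engine or cron-based tooling.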