
Answer-first summary for fast verification
Answer: 1. Enable the Airflow REST API and set up Cloud Storage notifications to trigger a Cloud Function. 2. Create a Private Service Connect (PSC) endpoint. 3. Write a Cloud Function that connects to the Cloud Composer cluster through the PSC endpoint.
The correct approach is option C. Here's the rationale:

1. **Enable the Airflow REST API**: External services can only trigger DAG runs programmatically if the Airflow REST API is enabled, which makes per-file, event-driven processing possible.
2. **Create a Private Service Connect (PSC) endpoint**: Because the subnetwork has no Internet connectivity, a PSC endpoint provides secure, private access to the Cloud Composer environment's Airflow web server without routing traffic over the public Internet.
3. **Develop a Cloud Function that communicates via the PSC endpoint**: Triggered by a Cloud Storage notification, the function calls the Airflow REST API through the PSC endpoint so the DAG runs each time a new file arrives.

Options A, B, and D fall short by either omitting a critical component (the Airflow REST API or PSC) or by proposing a less suitable method, such as Serverless VPC Access, for reaching the cluster in a network without Internet connectivity.
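The Cloud Function in step 3 can be sketched as follows. This is a minimal, stdlib-only illustration: the environment URL, DAG id, and bucket/object names are hypothetical placeholders, and the OAuth token handling is reduced to a comment (a real deployment would obtain a token with `google.auth` and reach the Airflow web server through the PSC endpoint).

```python
import json
import urllib.request

# Hypothetical values -- replace with your environment's details.
# AIRFLOW_URL is the Composer Airflow web server address as reached
# through the Private Service Connect endpoint.
AIRFLOW_URL = "https://example-composer-web-server"
DAG_ID = "process_new_file"  # hypothetical DAG id


def build_dag_run_request(airflow_url, dag_id, bucket, name):
    """Build the URL and JSON payload for the stable Airflow REST API
    dagRuns endpoint, passing the new object's location as DAG conf."""
    url = f"{airflow_url}/api/v1/dags/{dag_id}/dagRuns"
    payload = {"conf": {"bucket": bucket, "name": name}}
    return url, payload


def trigger_dag(event, context):
    """Cloud Function entry point for a Cloud Storage 'finalize' event.

    `event` carries the bucket and object name of the newly arrived file.
    """
    url, payload = build_dag_run_request(
        AIRFLOW_URL, DAG_ID, event["bucket"], event["name"]
    )
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # In a real deployment, attach an OAuth 2.0 access token
            # obtained via google.auth; omitted to keep the sketch
            # self-contained.
            "Authorization": "Bearer <ACCESS_TOKEN>",
        },
        method="POST",
    )
    # The request travels over the PSC endpoint, so no Internet egress
    # is required from the function's VPC connector.
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The helper keeps the request construction separate from the network call, so the payload shape (a `conf` dict the DAG can read via `dag_run.conf`) is easy to verify in isolation.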
Author: LeetQuiz Editorial Team
You are configuring a DAG in a Cloud Composer 2 instance, which uses Apache Airflow to process files from a Cloud Storage bucket individually. The instance resides in a subnetwork without Internet connectivity. Your goal is to trigger the DAG dynamically upon the arrival of each new file, instead of relying on a predetermined schedule. How can you implement this reactive triggering mechanism under these constraints?