
Answer-first summary for fast verification
Answer: Option C — Migrate the existing Airflow orchestration configuration into Amazon Managed Workflows for Apache Airflow (Amazon MWAA), and implement the data quality checks as SQL tasks in Airflow during ingestion.
Because the company already runs Apache Airflow on-premises, migrating to Amazon Managed Workflows for Apache Airflow (Amazon MWAA) gives it an AWS managed service with the least amount of refactoring: the existing Airflow DAGs and SQL data quality tasks can largely be ported over as-is.
Author: Ritesh Yadav
Question 35
A company uses Apache Airflow to orchestrate the company's current on-premises data pipelines. The company runs SQL data quality check tasks as part of the pipelines. The company wants to migrate the pipelines to AWS and to use AWS managed services. Which solution will meet these requirements with the LEAST amount of refactoring?
A
Set up AWS Outposts in the AWS Region that is nearest to the location where the company uses Airflow. Migrate the servers into Outposts-hosted Amazon EC2 instances. Update the pipelines to interact with the Outposts-hosted EC2 instances instead of the on-premises servers.
B
Create a custom Amazon Machine Image (AMI) that contains the Airflow application and the code that the company needs to migrate. Use the custom AMI to deploy Amazon EC2 instances. Update the network connections to interact with the newly deployed EC2 instances.
C
Migrate the existing Airflow orchestration configuration into Amazon Managed Workflows for Apache Airflow (Amazon MWAA). Create the data quality checks during the ingestion to validate the data quality by using SQL tasks in Airflow.
D
Convert the pipelines to AWS Step Functions workflows. Recreate the data quality checks in SQL as Python-based AWS Lambda functions.