
Answer-first summary for fast verification
Answer: Use the Data Flow activity and configure the 'Data Quality' tab to define and apply data quality checks.
Option B, the Data Flow activity, is the most suitable choice for implementing data quality checks in Azure Data Factory. Within a mapping data flow you can define and apply checks such as column pattern matching, data type validation, and data profiling; in practice these are built from transformations like Assert (expression-based validation) and Conditional Split (routing failing rows to an error sink). This approach keeps the quality logic inside the transformation pipeline itself, so the data is verified against the required standards before it is processed further or written to storage.
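Azure Data Factory configures these checks visually (or via JSON) inside a mapping data flow rather than in code, but the underlying logic can be sketched in plain Python. The rule set, column names, and sample rows below are purely illustrative, not ADF APIs; the idea mirrors how a data flow can validate each row against expressions and route failures to a separate sink.

```python
import re

def check_quality(rows, rules):
    """Apply simple data quality rules to a list of row dicts.

    `rules` maps a column name to a (check_fn, description) pair.
    Rows failing any rule are collected with the reason, analogous
    to routing bad rows to an error output in a data flow.
    """
    failures = []
    for i, row in enumerate(rows):
        for column, (check, description) in rules.items():
            if not check(row.get(column)):
                failures.append((i, column, description))
    return failures

# Illustrative rules: a column pattern check and a type/range check.
rules = {
    "email": (
        lambda v: isinstance(v, str)
        and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
        "must match an email pattern",
    ),
    "age": (
        lambda v: isinstance(v, int) and 0 <= v <= 130,
        "must be an integer between 0 and 130",
    ),
}

rows = [
    {"email": "alice@example.com", "age": 34},  # passes both checks
    {"email": "not-an-email", "age": 34},       # fails email pattern
    {"email": "bob@example.com", "age": -5},    # fails age range
]

for index, column, description in check_quality(rows, rules):
    print(f"row {index}: {column} {description}")
```

The same separation applies in a data flow: rows that pass continue to the main sink, while failures can be logged or quarantined for review.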
Author: LeetQuiz Editorial Team
You are working on a data transformation project using Azure Data Factory and need to perform a series of data quality checks on a dataset. Which of the following Azure Data Factory features would you use to implement these data quality checks, and how would you configure the feature to handle different data quality issues?
A. Use the Copy Data activity and configure the 'Validation' settings to perform data quality checks during the copy process.
B. Use the Data Flow activity and configure the 'Data Quality' tab to define and apply data quality checks.
C. Use the Execute SSIS Package activity and import the SSIS package that contains the data quality checks.
D. Use the Lookup activity to retrieve the data quality rules and then use a custom activity to perform the data quality checks.