
Answer-first summary for fast verification
Answer: Use data flow transformations to validate data as part of the pipeline.
Option B is correct because it builds validation directly into the pipeline using data flow transformations (such as Assert or Conditional Split), making the checks automated, repeatable, and scalable, so data quality and completeness are verified as part of every run. Option A (visual inspection) is impractical for large datasets and subjective; Option C (manual comparison of source and destination) is time-consuming and error-prone; Option D is workable but requires developing and maintaining custom validation logic that data flow transformations already provide out of the box.
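To make the idea concrete, the sketch below expresses in plain Python the kind of completeness and quality checks that Option B would run inside an ADF data flow (for example, via the Assert or Conditional Split transformations). The field names, thresholds, and function are illustrative assumptions, not ADF API calls.

```python
# Hypothetical sketch of batch-load validation logic, mirroring what an
# ADF data flow would perform inline with Assert / Conditional Split.
# Row structure and required fields are assumptions for illustration.

def validate_batch(source_rows, dest_rows, required_fields):
    """Return a list of validation failures for a batch load."""
    failures = []
    # Completeness check: destination should hold every source row.
    if len(dest_rows) != len(source_rows):
        failures.append(
            f"row count mismatch: source={len(source_rows)}, dest={len(dest_rows)}"
        )
    # Quality check: required fields must be present and non-empty.
    for i, row in enumerate(dest_rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append(f"row {i}: missing value for '{field}'")
    return failures
```

In an actual pipeline these checks run per-row as the data flows through, so bad records can be flagged or routed to an error sink without a separate manual step.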
Author: LeetQuiz Editorial Team
You are tasked with validating batch loads in Azure Data Factory to ensure data quality and completeness. Which of the following methods would you use to perform this validation, and why?
A
Visually inspect the data in the destination dataset.
B
Use data flow transformations to validate data as part of the pipeline.
C
Manually compare the source and destination datasets for discrepancies.
D
Implement custom validation logic using Azure Data Factory functions.