
Answer-first summary for fast verification
Answer: Use the 'Data Flow' activity with a 'Source' setting that can handle multiple data formats.
Option B is correct: the 'Data Flow' activity's 'Source' setting can be configured to read multiple data formats, so a single pipeline can process data from several sources with varying formats, keeping the architecture simple. Option A requires building and maintaining a custom connector for every data source, which is neither efficient nor scalable. Option C adds complexity by pushing transformation into a custom application outside Azure Data Factory. Option D works, but it requires a separate pipeline for each data source and format, making it the least efficient choice.
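To make the reasoning concrete, here is a minimal, hypothetical Python sketch (not the Azure Data Factory API) of the pattern Option B describes: one format-aware source that normalizes payloads in different formats into a single schema, so downstream logic never branches per source. The `parse_records` function and the sample payloads are illustrative assumptions.

```python
import csv
import io
import json

def parse_records(payload: str, fmt: str) -> list[dict]:
    """Normalize a raw payload into a list of row dicts.

    Analogous to a Data Flow source whose format settings let one
    pipeline ingest heterogeneous inputs. 'parse_records' is a
    hypothetical helper for illustration only.
    """
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

# A single entry point handles both sources:
csv_rows = parse_records("id,name\n1,alice\n2,bob", "csv")
json_rows = parse_records('[{"id": "3", "name": "carol"}]', "json")
```

Contrast this with Option A or D, where each new source format would mean a new connector or a new pipeline rather than one more branch in a shared source definition.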
Author: LeetQuiz Editorial Team
In a scenario where you need to implement a data pipeline that processes data from multiple sources with varying data formats, which of the following Azure Data Factory features would you use to handle this requirement?
A
Use the 'Copy Data' activity with a custom connector for each data source.
B
Use the 'Data Flow' activity with a 'Source' setting that can handle multiple data formats.
C
Create a custom data processing application that can read and transform data from all sources before loading it into Azure Data Factory.
D
Use the 'Execute Pipeline' activity to call multiple pipelines, each configured for a specific data source and format.