You need to implement a data pipeline that processes data from multiple sources with varying data formats. Which of the following Azure Data Factory features would you use to handle this requirement?
A. Use the 'Copy Data' activity with a custom connector for each data source.
B. Use the 'Data Flow' activity with a 'Source' setting that can handle multiple data formats.
C. Create a custom data processing application that can read and transform data from all sources before loading it into Azure Data Factory.
D. Use the 'Execute Pipeline' activity to call multiple pipelines, each configured for a specific data source and format.
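
To make the scenario concrete, here is a minimal sketch of the kind of pipeline definition option B describes: a single pipeline running one mapping data flow, where each differently formatted input would be declared as a separate source inside that data flow. The names `MultiSourcePipeline` and `MultiFormatFlow` are hypothetical placeholders, and the JSON is built as a Python dict purely for illustration.

```python
import json

# Hypothetical Azure Data Factory pipeline definition (sketch only).
# It contains a single 'Execute Data Flow' activity that references a
# mapping data flow; the data flow itself (not shown) would define one
# source transformation per input format (e.g., CSV, JSON, Parquet).
pipeline = {
    "name": "MultiSourcePipeline",  # placeholder pipeline name
    "properties": {
        "activities": [
            {
                "name": "TransformAllSources",
                "type": "ExecuteDataFlow",  # runs a mapping data flow
                "typeProperties": {
                    "dataFlow": {
                        "referenceName": "MultiFormatFlow",  # placeholder
                        "type": "DataFlowReference",
                    }
                },
            }
        ]
    },
}

# Print the JSON you would see in the ADF authoring UI's code view.
print(json.dumps(pipeline, indent=2))
```

The design point this illustrates is that a mapping data flow centralizes the format handling: sources with different schemas and file types feed one transformation graph, rather than each format requiring its own pipeline or custom connector.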