
Given a data pipeline in Azure Data Factory that must move and transform data across multiple sources and destinations, how would you manage the pipelines to ensure efficient and reliable data movement and transformation, making use of linked services, datasets, and activities?
A. Create separate pipelines for each source and destination pair.
B. Use a single, monolithic pipeline that includes all sources and destinations.
C. Design modular pipelines with reusable linked services and datasets, and orchestrate them using control flow activities.
D. Manually copy data from each source to each destination using individual activities.
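The modular design described in option C can be sketched in plain Python. This is an illustrative model only, not the Azure Data Factory SDK: the `LinkedService`, `Dataset`, `CopyActivity`, and `Pipeline` classes below are hypothetical stand-ins showing how reusable connection and dataset definitions are shared across modular pipelines and orchestrated by a control-flow layer.

```python
from dataclasses import dataclass, field

# Simplified stand-ins for ADF concepts; names and fields are illustrative.

@dataclass(frozen=True)
class LinkedService:
    name: str
    connection: str  # connection details, defined once and reused

@dataclass(frozen=True)
class Dataset:
    name: str
    linked_service: LinkedService  # references a shared linked service

@dataclass
class CopyActivity:
    name: str
    source: Dataset
    sink: Dataset

    def run(self) -> str:
        # A real copy activity would move data; here we just record the hop.
        return f"copied {self.source.name} -> {self.sink.name}"

@dataclass
class Pipeline:
    name: str
    activities: list = field(default_factory=list)

    def run(self) -> list:
        # Control-flow orchestration: run activities in order. A real
        # pipeline would add dependencies, branching, and retry policies.
        return [activity.run() for activity in self.activities]

# Linked services and datasets defined once, shared by every pipeline.
sql_source = LinkedService("sql-src", "placeholder-connection")
blob_sink = LinkedService("blob-sink", "placeholder-connection")
orders = Dataset("orders", sql_source)
staged_orders = Dataset("staged_orders", blob_sink)

# A small, modular pipeline composed from the shared building blocks.
ingest = Pipeline("ingest_orders",
                  [CopyActivity("copy_orders", orders, staged_orders)])
print(ingest.run())
```

Because linked services and datasets are defined once and referenced by many pipelines, adding a new source or destination means defining one new building block rather than duplicating connection logic in every pipeline, which is the core advantage over options A, B, and D.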