
Answer-first summary for fast verification
Answer: Design modular pipelines with reusable linked services and datasets, and orchestrate these using control flow activities.
Modular pipelines built from reusable components such as linked services and datasets are easier to maintain and scale than one monolithic pipeline or many near-duplicate pipelines. Orchestrating those modules with control flow activities (e.g., ForEach, If Condition, Execute Pipeline) keeps data movement and transformation efficient and reliable while reducing duplication and complexity.
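As a concrete illustration of the correct answer, the sketch below shows a single parameterized pipeline that uses a ForEach control flow activity to drive a reusable Copy activity over many source tables. The names (`PL_CopyModular`, `DS_SqlSource`, `DS_LakeSink`, the `sourceTables` parameter) are hypothetical placeholders, and the dataset and source/sink types are assumptions for the example; the overall shape follows the Azure Data Factory pipeline JSON schema.

```json
{
  "name": "PL_CopyModular",
  "properties": {
    "parameters": {
      "sourceTables": { "type": "array" }
    },
    "activities": [
      {
        "name": "ForEachSourceTable",
        "type": "ForEach",
        "typeProperties": {
          "items": {
            "value": "@pipeline().parameters.sourceTables",
            "type": "Expression"
          },
          "activities": [
            {
              "name": "CopyTable",
              "type": "Copy",
              "inputs": [
                {
                  "referenceName": "DS_SqlSource",
                  "type": "DatasetReference",
                  "parameters": { "tableName": "@item()" }
                }
              ],
              "outputs": [
                {
                  "referenceName": "DS_LakeSink",
                  "type": "DatasetReference",
                  "parameters": { "fileName": "@item()" }
                }
              ],
              "typeProperties": {
                "source": { "type": "SqlServerSource" },
                "sink": { "type": "ParquetSink" }
              }
            }
          ]
        }
      }
    ]
  }
}
```

Because `DS_SqlSource` and `DS_LakeSink` are parameterized datasets that each reference a shared linked service, one dataset/linked-service pair serves every table in the loop, instead of one pipeline per source-destination pair (option A) or one giant pipeline listing every combination (option B).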
Author: LeetQuiz Editorial Team
Given a scenario where a data pipeline in Azure Data Factory needs to process data from multiple sources and destinations, describe how you would manage the pipelines to ensure efficient and reliable data movement and transformation, including the use of linked services, datasets, and activities.
A
Create separate pipelines for each source and destination pair.
B
Use a single, monolithic pipeline that includes all sources and destinations.
C
Design modular pipelines with reusable linked services and datasets, and orchestrate these using control flow activities.
D
Manually copy data from each source to each destination using individual activities.