
Answer-first summary for fast verification
Answer: Design the pipeline with multiple triggers, each tailored to the update frequency of a specific source, and use control flow activities to manage data integration based on these frequencies.
Designing the pipeline with multiple triggers, each aligned with the update frequency of a specific source, ensures each dataset is processed as soon as it is refreshed rather than waiting for an unrelated schedule. Control flow activities (such as If Condition or Switch) then route integration logic per source, so the pipeline avoids redundant runs, optimizes resource usage, and keeps processing times proportional to the data that actually changed.
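The trigger-per-source pattern can be sketched as a small simulation in Python. This is an illustration only: the source names and intervals are hypothetical, and in Azure Data Factory the triggers themselves would be defined as separate schedule-trigger resources, with the branching done by control flow activities inside the pipeline.

```python
from dataclasses import dataclass

# Hypothetical sources, each with its own trigger recurrence (minutes),
# mirroring one ADF schedule trigger per source.
@dataclass
class Source:
    name: str
    interval_min: int

SOURCES = [
    Source("sales_db", 15),
    Source("clickstream", 5),
    Source("erp_export", 60),
]

def due_sources(elapsed_min, sources=SOURCES):
    """Control-flow step: select only the sources whose trigger fires now."""
    return [s.name for s in sources if elapsed_min % s.interval_min == 0]

def run_pipeline(elapsed_min):
    """Integrate only the sources that are due, instead of reprocessing
    every source at one fixed interval."""
    return {name: f"copied {name} at t={elapsed_min}min"
            for name in due_sources(elapsed_min)}
```

For example, at the 15-minute mark only `sales_db` and `clickstream` fire, while `erp_export` waits for its hourly trigger; at 60 minutes all three run. This is the efficiency gain over a single fixed-interval schedule.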
Author: LeetQuiz Editorial Team
In a scenario where a data pipeline in Azure Data Factory needs to handle data from various sources with different update frequencies, describe how you would design the pipeline to manage these varying data inputs efficiently, including the use of triggers and scheduling mechanisms.
A. Use a single Schedule trigger set to run the pipeline at the highest frequency among the sources.
B. Design the pipeline with multiple triggers, each tailored to the update frequency of a specific source, and use control flow activities to manage data integration based on these frequencies.
C. Manually trigger the pipeline each time data from any source is updated.
D. Ignore the update frequencies and process all data at a fixed interval.