
You are designing an Azure Data Factory pipeline to ingest and transform data from an on-premises SQL Server to Azure Blob Storage. The data includes a large table of customer transactions that needs to be partitioned by date. Which activities and connectors would you use in the pipeline to achieve this, and how would you configure the pipeline to handle the partitioning?
A. Use a Copy Data activity with SQL Server as the source and Blob Storage as the destination, without any partitioning.
B. Use a Lookup activity to retrieve the partition keys, followed by a ForEach activity to iterate over the partitions and a Copy Data activity to transfer each partition.
C. Use a single Copy Data activity with SQL Server as the source and Blob Storage as the destination, and configure the sink to partition the data by date.
D. Use a Web activity to trigger an external script that partitions the data before copying it to Blob Storage.
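Option B describes the standard dynamic-partitioning pattern in Data Factory: a Lookup activity returns the list of distinct transaction dates, a ForEach activity iterates over that list, and a parameterized Copy activity inside the loop moves one date's rows to a date-stamped blob path. Below is a minimal sketch of that pipeline as the JSON definition ADF consumes, built as a Python dict; the dataset names (TransactionsSqlDataset, TransactionsBlobDataset), the table, and the PartitionDate column are hypothetical placeholders, not part of the original question.

```python
import json

# Sketch of the option-B pattern: Lookup -> ForEach -> Copy per partition.
# Dataset names, table names, and the PartitionDate column are hypothetical.
pipeline = {
    "name": "CopyTransactionsByDate",
    "properties": {
        "activities": [
            {
                # Lookup returns one row per distinct transaction date.
                "name": "LookupPartitionDates",
                "type": "Lookup",
                "typeProperties": {
                    "source": {
                        "type": "SqlServerSource",
                        "sqlReaderQuery": (
                            "SELECT DISTINCT CAST(TransactionDate AS date) "
                            "AS PartitionDate FROM dbo.CustomerTransactions"
                        ),
                    },
                    "dataset": {
                        "referenceName": "TransactionsSqlDataset",
                        "type": "DatasetReference",
                    },
                    "firstRowOnly": False,
                },
            },
            {
                # ForEach walks the Lookup output; each iteration copies one
                # date's rows into a date-stamped blob folder.
                "name": "ForEachPartition",
                "type": "ForEach",
                "dependsOn": [{
                    "activity": "LookupPartitionDates",
                    "dependencyConditions": ["Succeeded"],
                }],
                "typeProperties": {
                    "items": {
                        "value": "@activity('LookupPartitionDates').output.value",
                        "type": "Expression",
                    },
                    "activities": [{
                        "name": "CopyOnePartition",
                        "type": "Copy",
                        "typeProperties": {
                            "source": {
                                "type": "SqlServerSource",
                                "sqlReaderQuery": {
                                    "value": (
                                        "SELECT * FROM dbo.CustomerTransactions "
                                        "WHERE CAST(TransactionDate AS date) = "
                                        "'@{item().PartitionDate}'"
                                    ),
                                    "type": "Expression",
                                },
                            },
                            "sink": {"type": "BlobSink"},
                        },
                        "inputs": [{
                            "referenceName": "TransactionsSqlDataset",
                            "type": "DatasetReference",
                        }],
                        "outputs": [{
                            "referenceName": "TransactionsBlobDataset",
                            "type": "DatasetReference",
                            # The blob dataset would take the date as a
                            # parameter and use it in its folder path.
                            "parameters": {"partitionDate": {
                                "value": "@item().PartitionDate",
                                "type": "Expression",
                            }},
                        }],
                    }],
                },
            },
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

Two practical notes on this design: reaching the on-premises SQL Server requires a self-hosted integration runtime on the SQL Server linked service, and the Lookup activity caps its output (about 5,000 rows / 4 MB), so a very long date range may need coarser partition keys such as months.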