
You are designing an Azure Data Factory pipeline to ingest and transform data from Azure Cosmos DB to Azure Blob Storage. The data includes a large collection of customer profiles that need to be partitioned by date of birth. Which activities and connectors would you use in the pipeline to achieve this, and how would you configure the pipeline to handle the partitioning?
A
Use a Copy Data activity to copy the entire collection from Cosmos DB to Blob Storage without any partitioning.
B
Use a Data Flow activity to read the collection, partition it by date of birth, and then write each partition to Blob Storage.
C
Use a Lookup activity to retrieve the distinct dates of birth, followed by a ForEach activity to iterate over the dates and a Copy Data activity to copy each partition to Blob Storage.
D
Use a Web activity to trigger an external script that partitions the collection before copying it to Blob Storage.
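For reference, the pattern described in option C (Lookup, then ForEach, then Copy Data) can be sketched programmatically with the azure-mgmt-datafactory Python SDK. The snippet below is illustrative only: the dataset names (CosmosCustomerProfiles, BlobPartitionedProfiles), the dateOfBirth field, the partitionDate dataset parameter, and the resource identifiers are assumptions, not part of the question.

```python
# Illustrative sketch of option C's Lookup -> ForEach -> Copy pattern.
# All names marked below are hypothetical placeholders, not values from the question.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource,
    LookupActivity,
    ForEachActivity,
    CopyActivity,
    ActivityDependency,
    CosmosDbSqlApiSource,
    BlobSink,
    DatasetReference,
    Expression,
)

# Hypothetical resource identifiers.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# 1) Lookup: retrieve the distinct dates of birth from the Cosmos DB collection.
lookup = LookupActivity(
    name="LookupDistinctDob",
    dataset=DatasetReference(reference_name="CosmosCustomerProfiles"),
    source=CosmosDbSqlApiSource(query="SELECT DISTINCT VALUE c.dateOfBirth FROM c"),
    first_row_only=False,  # return the full array of distinct values
)

# 2) Copy: inside the ForEach, copy one date-of-birth partition per iteration.
#    @item() refers to the current date of birth from the Lookup output.
copy_partition = CopyActivity(
    name="CopyDobPartition",
    inputs=[DatasetReference(reference_name="CosmosCustomerProfiles")],
    outputs=[
        DatasetReference(
            reference_name="BlobPartitionedProfiles",
            # Hypothetical dataset parameter that routes each partition to its own folder.
            parameters={"partitionDate": {"value": "@item()", "type": "Expression"}},
        )
    ],
    source=CosmosDbSqlApiSource(
        query={
            "value": "SELECT * FROM c WHERE c.dateOfBirth = '@{item()}'",
            "type": "Expression",
        }
    ),
    # The sink type must match the Blob dataset type; BlobSink is shown for an Azure Blob dataset.
    sink=BlobSink(),
)

# 3) ForEach: iterate over the distinct dates of birth returned by the Lookup.
for_each = ForEachActivity(
    name="ForEachDob",
    items=Expression(value="@activity('LookupDistinctDob').output.value"),
    activities=[copy_partition],
    depends_on=[
        ActivityDependency(activity="LookupDistinctDob", dependency_conditions=["Succeeded"])
    ],
)

pipeline = PipelineResource(activities=[lookup, for_each])
client.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "PartitionProfilesByDob", pipeline
)
```

Whether this per-partition loop or a single key-partitioned mapping data flow (as in option B) is the better fit depends on data volume and on whether additional transformations are required.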