You are designing an Azure Data Factory pipeline to ingest and transform data from Azure Cosmos DB into Azure Blob Storage. The data is a large collection of customer profiles that must be partitioned by date of birth. Which activities and connectors would you use in the pipeline, and how would you configure it to handle the partitioning?
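One way to answer this is with a Copy activity that uses the Cosmos DB (SQL API) connector as the source and a Blob Storage dataset as the sink, run inside a ForEach over birth-year values so each iteration writes one partition folder. The sketch below is illustrative only: the dataset names (`CosmosCustomerProfiles`, `BlobCustomerProfiles`), the `dateOfBirth` document field, the `folderPath` dataset parameter, and the `dob=<year>` folder convention are all assumptions, not part of the question.

```json
{
  "name": "CopyProfilesByBirthYear",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "CosmosDbSqlApiSource",
      "query": {
        "value": "SELECT * FROM c WHERE STARTSWITH(c.dateOfBirth, '@{item()}')",
        "type": "Expression"
      }
    },
    "sink": { "type": "JsonSink" }
  },
  "inputs": [
    { "referenceName": "CosmosCustomerProfiles", "type": "DatasetReference" }
  ],
  "outputs": [
    {
      "referenceName": "BlobCustomerProfiles",
      "type": "DatasetReference",
      "parameters": {
        "folderPath": { "value": "customers/dob=@{item()}", "type": "Expression" }
      }
    }
  ]
}
```

Here `@{item()}` is the current year string supplied by the enclosing ForEach activity, and the Blob dataset exposes a `folderPath` parameter used in its file path. An alternative that avoids the ForEach loop is a Mapping Data Flow, whose sink supports key-based partitioning on a derived column (e.g. the year extracted from `dateOfBirth`).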