
You are designing an Azure Data Factory pipeline to ingest and transform data from an Azure Blob Storage container into Azure SQL Database. The source data includes a large CSV file that must be split into smaller files based on a specific column, 'Region'. Which activities and connectors would you use in the pipeline, and how would you configure it to handle the splitting and transformation?
A
Use a Copy Data activity to copy the entire CSV file from Blob Storage to SQL Database without any splitting.
B
Use a Data Flow activity to read the CSV file, split it by 'Region', and then write each split file to SQL Database.
C
Use a Lookup activity to retrieve the distinct regions, followed by a ForEach activity to iterate over the regions and a Copy Data activity to copy each region's data to SQL Database.
D
Use a Web activity to trigger an external script that splits the CSV file before copying it to SQL Database.
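For context on options B and C, the core transformation both describe — partitioning CSV rows by the 'Region' column before loading each partition to its target — can be sketched outside of ADF. This is a minimal Python illustration of the splitting logic only, not an ADF artifact; the column name comes from the question, while the sample data is invented:

```python
import csv
from collections import defaultdict
from io import StringIO

def split_by_region(csv_text: str) -> dict[str, list[dict]]:
    """Group CSV rows by their 'Region' value, mirroring what an ADF
    Data Flow (option B) or a Lookup + ForEach loop (option C) does:
    each group would then be written to its own sink."""
    reader = csv.DictReader(StringIO(csv_text))
    groups: dict[str, list[dict]] = defaultdict(list)
    for row in reader:
        groups[row["Region"]].append(row)
    return dict(groups)

# Hypothetical sample input; real data would come from Blob Storage.
sample = "Region,Sales\nEast,100\nWest,200\nEast,50\n"
parts = split_by_region(sample)
```

In ADF terms, option C's Lookup activity corresponds to collecting the distinct keys of `parts`, and the ForEach + Copy Data pair corresponds to iterating over those keys and loading each group separately.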