You are working on a batch processing solution that needs to handle data from multiple sources with different schemas. The solution should read the data, transform it into a unified schema, and write it to a Delta Lake table. How would you implement this functionality?
A. Use Azure Data Factory to orchestrate the data flow and use the Copy Data activity to read and transform the data.
B. Use Azure Databricks to read the data from the sources, perform schema mapping and transformation using its built-in functions, and write the results to the Delta Lake (see the sketch after the options).
C. Use Azure SQL Data Warehouse to store the data and perform schema mapping and transformation using T-SQL.
D. Use Azure Cosmos DB to store the data and perform schema mapping and transformation using its built-in support for multiple data models.
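Option B reflects the usual Databricks pattern: read each source, project it into a common schema, union the results, and write them out in Delta format. The following is a minimal PySpark sketch of that flow; the file paths, column names, and unified schema are hypothetical and only illustrate the idea.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("unify-to-delta").getOrCreate()

# Source 1: CSV files with columns (order_id, amount, ts) -- hypothetical schema
orders_csv = spark.read.option("header", True).csv("/mnt/raw/orders_csv/")

# Source 2: JSON files with differently named columns (id, total, event_time) -- hypothetical schema
orders_json = spark.read.json("/mnt/raw/orders_json/")

# Map each source into the unified schema: order_id STRING, amount DOUBLE, event_time TIMESTAMP
unified_csv = orders_csv.select(
    F.col("order_id").cast("string").alias("order_id"),
    F.col("amount").cast("double").alias("amount"),
    F.to_timestamp("ts").alias("event_time"),
)

unified_json = orders_json.select(
    F.col("id").cast("string").alias("order_id"),
    F.col("total").cast("double").alias("amount"),
    F.to_timestamp("event_time").alias("event_time"),
)

# Combine the sources and write the unified batch to a Delta Lake table
unified = unified_csv.unionByName(unified_json)
(unified.write
    .format("delta")
    .mode("append")
    .save("/mnt/delta/orders_unified"))
```

On Databricks the Delta Lake libraries are available by default, so the same job can be scheduled as a batch run; only the source reads and the column mappings change as new sources are added.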