
Answer-first summary for fast verification
Answer: Use Azure Databricks to read the data from the sources, perform data format conversion using its built-in functions, and write the results to the Delta Lake.
Option B is correct: Azure Databricks can read data from multiple sources in their native formats, apply Spark's built-in functions to convert them into a unified schema, and write the results directly to Delta Lake. Option A can orchestrate the flow, but Azure Data Factory's transformation activities offer less flexibility for complex, code-driven format conversion. Options C and D are storage-centric: Azure SQL Data Warehouse with T-SQL and Azure Cosmos DB's multi-model support do not provide the same end-to-end read-transform-write pipeline targeting a Delta Lake.
Author: LeetQuiz Editorial Team
You are working on a batch processing solution that needs to handle data from multiple sources with varying data formats. The solution should be able to read the data, transform it into a unified format, and write it to a Delta Lake. How would you implement this functionality?
A. Use Azure Data Factory to orchestrate the data flow and use its data transformation activities to convert the data into a unified format.
B. Use Azure Databricks to read the data from the sources, perform data format conversion using its built-in functions, and write the results to the Delta Lake.
C. Use Azure SQL Data Warehouse to store the data and perform data format conversion using T-SQL.
D. Use Azure Cosmos DB to store the data and perform data format conversion using its built-in support for multiple data models.