
Answer-first summary for fast verification
Answer: Use Azure Databricks to read from and write to the Delta Lake using its built-in support for Delta Lake.
Option B is correct because Azure Databricks ships with native Delta Lake support: the Delta engine is built into the Databricks runtime, so batch jobs can read and write Delta tables directly through the Spark DataFrame API while retaining ACID transactions, schema enforcement and evolution, and data versioning (time travel). This keeps the data organized, efficient to store, and easy to query and analyze. The other options are weaker fits: Azure Data Factory's Copy Data activity (A) is oriented toward data movement and gives far less direct control over Delta features than working in Databricks itself; Azure Stream Analytics (C) targets streaming rather than batch workloads; and Azure Functions with custom code (D) would force you to hand-build capabilities that Databricks provides out of the box.
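A minimal PySpark sketch of option B as it would run in an Azure Databricks notebook. The mount paths and the Parquet source format are illustrative assumptions, not details given in the question; the `spark` session already exists in a Databricks notebook and is created here only so the snippet is self-contained.

```python
# Sketch of option B: batch read/write against a Delta Lake table with PySpark.
# Paths below are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-batch").getOrCreate()

source_path = "/mnt/raw/events"           # hypothetical raw input location
delta_path = "/mnt/curated/events_delta"  # hypothetical Delta table path

# Batch read of raw data (Parquet assumed here for illustration).
df = spark.read.format("parquet").load(source_path)

# Write to Delta Lake; the transaction log provides ACID guarantees.
(df.write
   .format("delta")
   .mode("append")
   .option("mergeSchema", "true")  # opt in to schema evolution
   .save(delta_path))

# Read the Delta table back for querying and analysis.
curated = spark.read.format("delta").load(delta_path)
curated.createOrReplaceTempView("events")
spark.sql("SELECT COUNT(*) AS n FROM events").show()

# Time travel: read an earlier version of the table (data versioning).
previous = spark.read.format("delta").option("versionAsOf", 0).load(delta_path)
```

The `mergeSchema` option and `versionAsOf` read illustrate the schema-evolution and versioning features the explanation mentions; in production you would typically manage the table by name in Unity Catalog rather than by path.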
Author: LeetQuiz Editorial Team
In your batch processing solution, you need to read from and write to a Delta Lake. You want to ensure that the data is stored in an organized and efficient manner, allowing for easy querying and analysis. How would you approach this task?
A
Use Azure Data Factory to orchestrate the data flow and use the Copy Data activity to read from and write to the Delta Lake.
B
Use Azure Databricks to read from and write to the Delta Lake using its built-in support for Delta Lake.
C
Use Azure Stream Analytics to read from the data sources and write the results to the Delta Lake using its output connectors.
D
Use Azure Functions to read from the data sources and write the results to the Delta Lake using custom code.