In your batch processing solution, you need to read from and write to a Delta Lake table. You want the data stored in an organized and efficient manner that allows for easy querying and analysis. How would you approach this task?
A. Use Azure Data Factory to orchestrate the data flow and use the Copy Data activity to read from and write to the Delta Lake.
B. Use Azure Databricks to read from and write to the Delta Lake using its built-in support for Delta Lake.
C. Use Azure Stream Analytics to read from the data sources and write the results to the Delta Lake using its output connectors.
D. Use Azure Functions to read from the data sources and write the results to the Delta Lake using custom code.