You are tasked with reading from and writing to a Delta Lake table in a batch processing solution. Describe the steps you would take to ensure data consistency and performance optimization, including any specific configurations or tools you would use.
A. Use Azure Data Factory for data movement, configure parallel processing, and implement data validation checks.
B. Manually read and write data using Python scripts, without any specific configurations for performance.
C. Use a single thread to read and write data sequentially to avoid data inconsistencies.
D. Copy data to a different storage format before processing to improve performance.
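In practice, the approach in option A would be built with Azure Data Factory pipelines or Spark against the Delta table. As a minimal illustration of the two ideas it names, parallel batch processing and data validation checks, the sketch below uses plain Python only; the record schema (`id`, `amount`) and the validation rules are hypothetical, not part of the question.

```python
from concurrent.futures import ThreadPoolExecutor

def validate(record):
    # Hypothetical data validation check: require a non-null id
    # and a non-negative amount before the record is written out.
    return record.get("id") is not None and record.get("amount", -1) >= 0

def process_batch(batch):
    # Keep only records that pass validation; invalid rows are dropped
    # here, though a real pipeline might route them to a quarantine sink.
    return [r for r in batch if validate(r)]

def run_batches(batches, max_workers=4):
    # Process batches in parallel, analogous to configuring parallel
    # copy activities in a data-movement pipeline.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(process_batch, batches))
    # Flatten the per-batch results into one list of clean records.
    return [r for batch in results for r in batch]

batches = [
    [{"id": 1, "amount": 10.0}, {"id": None, "amount": 5.0}],
    [{"id": 2, "amount": -3.0}, {"id": 3, "amount": 7.5}],
]
clean = run_batches(batches)
# Only the records with a valid id and non-negative amount survive.
```

For an actual Delta Lake target, the same validation step would typically run as a Spark transformation before `df.write.format("delta")`, with Delta's ACID transactions providing the consistency guarantee the question asks about.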