
Answer-first summary for fast verification
Answer: Use Azure Data Factory for data movement, configure parallel processing, and implement data validation checks.
Azure Data Factory provides scalable, reliable data movement with built-in retry and monitoring. Configuring parallel processing (for example, parallel copy activities and partitioned reads) improves throughput, while data validation checks, such as row counts, schema checks, and null checks run before and after each batch, keep the Delta Lake consistent. Together, these steps follow Azure best practices for batch processing.
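The validation-check step can be sketched in plain Python. This is a minimal illustration, not an ADF or Delta Lake API: the function and field names are our own, and in a real pipeline the same row-count and null checks would run against Spark DataFrames before committing a batch.

```python
# Minimal sketch of batch validation checks before a Delta Lake write.
# Rows are plain dicts here as a stand-in; all names are illustrative.

def validate_batch(rows, required_fields, expected_min_rows=1):
    """Return a list of validation errors for one batch (empty list = valid)."""
    errors = []
    # Row-count check: catch truncated or empty source extracts.
    if len(rows) < expected_min_rows:
        errors.append(f"row count {len(rows)} below minimum {expected_min_rows}")
    # Null check: every row must carry all required fields.
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:
                errors.append(f"row {i}: missing required field '{field}'")
    return errors

batch = [
    {"id": 1, "amount": 9.99},
    {"id": 2, "amount": None},  # fails the null check
]
print(validate_batch(batch, required_fields=["id", "amount"]))
# -> ["row 1: missing required field 'amount'"]
```

Failing batches can then be quarantined rather than written, so a bad extract never corrupts the Delta table.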
Author: LeetQuiz Editorial Team
You are tasked with reading from and writing to a Delta Lake in a batch processing solution. Describe the steps you would take to ensure data consistency and performance optimization, including any specific configurations or tools you would use.
A
Use Azure Data Factory for data movement, configure parallel processing, and implement data validation checks.
B
Manually read and write data using Python scripts, without any specific configurations for performance.
C
Use a single thread to read and write data sequentially to avoid data inconsistencies.
D
Copy data to a different storage format before processing to improve performance.