
Answer-first summary for fast verification
Answer: Use Azure Databricks to process the data and configure the retention policy using Azure Data Lake Storage Gen2's lifecycle management features.
Option B is correct because it pairs Azure Databricks for batch processing with Azure Data Lake Storage Gen2 for storage. ADLS Gen2 supports lifecycle management policies (built on Azure Blob Storage lifecycle management), which can retain processed data for a configured period and then automatically tier or delete it, managing storage and data lifecycle without custom cleanup jobs. Options A, C, and D do not provide the same level of control and flexibility for configuring retention policies on large batch-processed datasets.
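As an illustration of the correct option, a lifecycle management policy on the storage account backing the ADLS Gen2 filesystem might look like the sketch below. The account name, resource group, `processed/` prefix, and 90-day retention window are placeholder assumptions, not values given in the question.

```shell
# Sketch: a lifecycle rule that deletes processed blobs 90 days after
# their last modification (the retention window is an assumption).
cat > retention-policy.json <<'EOF'
{
  "rules": [
    {
      "enabled": true,
      "name": "delete-processed-after-90-days",
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "processed/" ]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 90 }
          }
        }
      }
    }
  ]
}
EOF

# Apply the policy to the storage account behind the ADLS Gen2 filesystem
# (account and resource-group names are placeholders).
az storage account management-policy create \
  --account-name mydatalakeacct \
  --resource-group my-rg \
  --policy @retention-policy.json
```

Once applied, the platform evaluates the rule automatically, so data written by the Databricks batch job under the matching prefix is removed after the retention period without any scheduled cleanup code.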
Author: LeetQuiz Editorial Team
You are working on a batch processing solution that requires configuring batch retention policies to manage the storage and lifecycle of the processed data. How would you implement this functionality?
A
Use Azure Data Factory's data flow activities to process the data and configure the retention policy using Azure Blob Storage's lifecycle management features.
B
Use Azure Databricks to process the data and configure the retention policy using Azure Data Lake Storage Gen2's lifecycle management features.
C
Use Azure SQL Database to store the processed data and configure the retention policy using its built-in data retention features.
D
Use Azure Cosmos DB to store the processed data and configure the retention policy using its time-to-live (TTL) feature.