
Answer-first summary for fast verification
Answer: Use Azure Databricks for big data processing and implement Azure Data Lake Storage (ADLS) with encryption at rest and Azure Active Directory (AAD) for authentication and authorization.
Option A is correct: Azure Databricks handles the big data processing, Azure Data Lake Storage (ADLS) provides encryption at rest for the stored data, and Azure Active Directory (AAD) supplies centralized authentication and authorization. Together these controls keep the data secure and support compliance with data protection regulations. Option B is incorrect because Azure Private Link only secures network access and Azure Information Protection classifies documents; neither addresses encryption at rest for the data processed in HDInsight. Option C is only partially correct: Azure Data Factory orchestrates data movement and Azure Policy enforces governance rules, but neither secures the analytics data itself with encryption or identity-based access control. Option D is incorrect because Azure Cosmos DB is an operational NoSQL database and is not primarily designed for big data analytics workloads.
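As a rough illustration of option A's storage side, the following Azure CLI sketch provisions an ADLS Gen2 account (encryption at rest with Microsoft-managed keys is enabled by default on storage accounts) and grants data access through an AAD role assignment rather than account keys. All resource names and the service principal ID are placeholders, not values from the question.

```shell
# Hypothetical resource names; substitute your own.
RG=analytics-rg
SA=analyticsadls01
LOCATION=eastus

# ADLS Gen2 account: --hns enables the hierarchical namespace;
# encryption at rest is on by default for Azure Storage.
az storage account create \
  --name "$SA" \
  --resource-group "$RG" \
  --location "$LOCATION" \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hns true

# Grant a service principal (e.g. the one Databricks uses) data-plane
# access via an AAD role assignment instead of shared keys.
az role assignment create \
  --assignee "<service-principal-app-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "$(az storage account show -n "$SA" -g "$RG" --query id -o tsv)"
```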
Author: LeetQuiz Editorial Team
In a scenario where you need to implement data security for a big data analytics environment in Azure, describe the steps you would take to ensure that the data is secure and compliant with data protection regulations.
A
Use Azure Databricks for big data processing and implement Azure Data Lake Storage (ADLS) with encryption at rest and Azure Active Directory (AAD) for authentication and authorization.
B
Configure Azure HDInsight to use Azure Private Link for secure access and implement Azure Information Protection (AIP) to classify and protect sensitive data.
C
Utilize Azure Data Factory to orchestrate data movement between big data services and apply Azure Policy to enforce data security policies for data governance and compliance.
D
Implement Azure Cosmos DB for storing and processing big data and use Azure Key Vault to store and manage encryption keys for data at rest and in transit.
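On the Databricks side, option A's AAD-based access to ADLS can be sketched with the Hadoop ABFS driver's OAuth settings. This is a notebook-style fragment: `spark` and `dbutils` are objects provided by the Databricks runtime, and every angle-bracketed value is a placeholder, with the client secret read from a Databricks secret scope rather than hard-coded.

```python
# Sketch: service-principal (AAD/OAuth) access to ADLS Gen2 from a
# Databricks notebook. <...> values are placeholders.
storage_account = "<storage-account>"
tenant_id = "<tenant-id>"
suffix = f"{storage_account}.dfs.core.windows.net"

spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{suffix}",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}",
               "<application-id>")
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{suffix}",
    dbutils.secrets.get(scope="<scope>", key="<key>"))
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{suffix}",
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# Reads over abfss:// are then authorized through AAD, and the data
# is encrypted at rest by the storage service.
df = spark.read.parquet(
    f"abfss://<container>@{suffix}/<path>")
```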