
You are tasked with developing a batch processing solution for a healthcare organization that processes patient data daily. The solution must ensure data privacy and security, handle upserts, and provide exception handling. The organization also wants to integrate Jupyter or Python notebooks into the data pipeline for data exploration and analysis. How would you approach this task?
A
Use Azure Data Factory to orchestrate the data pipeline and leverage Azure Data Lake Storage Gen2 for storing the raw data, ensuring data privacy and security through encryption and access control.
B
Use Azure Databricks to process the data using its built-in support for Delta Lake, and integrate Jupyter notebooks for data exploration and analysis, while implementing data privacy and security measures.
C
Use Azure Stream Analytics to process the data in real-time and store the results in Azure SQL Database, ensuring data privacy and security through built-in features.
D
Use Azure Functions to process the data in small batches and store the results in Azure Cosmos DB, implementing data privacy and security through custom code.
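Of the options above, only the Databricks-with-Delta-Lake choice directly covers every stated requirement: Delta Lake has native upsert support through its MERGE INTO command, and Databricks supports notebook-based exploration. As a minimal sketch of the upsert piece, a daily patient batch might be merged into a Delta table like this (the table and column names `patients`, `daily_patient_updates`, and `patient_id` are hypothetical, not taken from the question):

```sql
-- Illustrative Delta Lake upsert: merge today's batch into the patient table.
-- Matching rows are updated in place; new patients are inserted.
MERGE INTO patients AS target
USING daily_patient_updates AS source
  ON target.patient_id = source.patient_id
WHEN MATCHED THEN
  UPDATE SET *
WHEN NOT MATCHED THEN
  INSERT *
```

Exception handling and privacy controls (encryption, access control, audit logging) would be layered around this step in the pipeline; the MERGE itself only addresses the upsert requirement.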