
You are tasked with developing a batch processing solution for a large e-commerce company that needs to process millions of transactions daily. The solution must support upserts, reverting data to a previous state, and configurable exception handling. Additionally, the company wants to integrate Jupyter or Python notebooks into the data pipeline for data exploration and analysis. How would you approach this task?
A. Use Azure Data Factory to orchestrate the data pipeline and leverage Azure Databricks for processing and analysis.
B. Use Azure Stream Analytics to process the data in real-time and store the results in Azure SQL Database.
C. Use Azure Data Lake Storage Gen2 to store the raw data and process it using Azure Databricks.
D. Use Azure Functions to process the data in small batches and store the results in Azure Cosmos DB.
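The upsert requirement the question names maps to merge semantics: incoming rows update existing records when the key matches and are inserted otherwise. On Azure Databricks this is typically done with Delta Lake's `MERGE INTO` (and `RESTORE TABLE ... TO VERSION AS OF` covers reverting to a previous state). The sketch below illustrates only the upsert semantics in plain Python with hypothetical data, not the Delta Lake API itself:

```python
# Minimal sketch of upsert (merge) semantics, using hypothetical transaction
# records. In Databricks the same logic would be a Delta Lake MERGE INTO
# statement; this plain-Python version only illustrates the behavior.

def upsert(target, updates, key="id"):
    """Update rows whose key already exists in target; insert rows whose key is new."""
    merged = {row[key]: row for row in target}
    for row in updates:
        # Existing key: overwrite fields with the incoming values (update).
        # New key: add the row (insert).
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())

existing = [{"id": 1, "total": 100}, {"id": 2, "total": 250}]
incoming = [{"id": 2, "total": 300}, {"id": 3, "total": 75}]

result = upsert(existing, incoming)
# id 1 is untouched, id 2 is updated, id 3 is inserted
```

The same three outcomes (unchanged, updated, inserted) are what a `WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT` clause expresses in Delta Lake SQL.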