
You are tasked with designing a solution to ingest and transform data from a real-time event stream into an Azure Data Warehouse. The event stream contains user click data, and you need to aggregate the data by user and calculate the total number of clicks per user. Which of the following Azure services and techniques would you use to achieve this, and how would you design the solution to handle the real-time nature of the data?
A. Use Azure Data Factory with a Copy Data activity to ingest the event stream into Azure Blob Storage, then use a Data Flow activity to aggregate the data and load it into the Data Warehouse.
B. Use Azure Stream Analytics to ingest the event stream, apply the aggregation logic, and write the results to Azure Data Lake Storage Gen2, then use Azure Data Factory to load the aggregated data into the Data Warehouse.
C. Use Azure Event Hubs to capture the event stream, then use Azure Functions to process the data in real time and store the aggregated results in Azure Cosmos DB.
D. Use Azure Logic Apps to trigger a workflow for each event in the stream, then use a custom connector to aggregate the data and store it in the Data Warehouse.
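The per-user click aggregation that the question asks for is typically expressed in Azure Stream Analytics as a SQL-like query that groups by user over a time window (e.g. `GROUP BY UserId, TumblingWindow(minute, 5)`). As a minimal sketch of that windowed count, here is the equivalent logic in plain Python; the event shape (`user_id`, timestamp pairs) and the function name are illustrative assumptions, since the real events would arrive through a streaming input rather than an in-memory list:

```python
from collections import Counter
from datetime import datetime, timedelta

def clicks_per_user(events, window_start, window_size):
    """Count clicks per user within one tumbling window.

    Mimics a Stream Analytics GROUP BY UserId, TumblingWindow(...)
    query: only events with window_start <= ts < window_end count.
    `events` is a hypothetical list of (user_id, timestamp) pairs.
    """
    window_end = window_start + window_size
    return dict(Counter(
        user for user, ts in events
        if window_start <= ts < window_end
    ))

# Illustrative click events; a real pipeline would read these from
# the event stream (e.g. an Event Hubs input), not a literal list.
events = [
    ("alice", datetime(2024, 1, 1, 0, 0, 10)),
    ("bob",   datetime(2024, 1, 1, 0, 0, 20)),
    ("alice", datetime(2024, 1, 1, 0, 4, 59)),
    ("alice", datetime(2024, 1, 1, 0, 5, 1)),  # lands in the next window
]
result = clicks_per_user(events, datetime(2024, 1, 1),
                         timedelta(minutes=5))
# result == {"alice": 2, "bob": 1}
```

Note the tumbling-window boundary: the 00:05:01 click is excluded from the first five-minute window and would be counted in the next one, which is exactly why windowing matters for real-time aggregation.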