How would you design a load test for a real-time streaming application using Azure Databricks and Azure Event Hubs to ensure the pipeline can handle peak data velocities and volumes without performance degradation?
A. Deploying Azure Functions to mimic real-time data generation at scale, directing the output to Event Hubs connected to your Databricks streaming job
B. Writing a Databricks notebook to simulate data production into Event Hubs, scaling up the notebook's resources to increase load
C. Utilizing a third-party load testing tool to generate high-velocity data streams towards Event Hubs, monitoring pipeline performance in Databricks with custom metrics
D. Configuring Azure Load Testing to simulate real-world traffic patterns and volumes, analyzing the impact on your streaming jobs in Databricks
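Whichever option you choose, the load generator ultimately has to push synthetic events into Event Hubs at a controlled, peak-like rate (the core of options A, B, and C). Below is a minimal sketch using the azure-eventhub Python SDK; the connection string, hub name, payload shape, and target rate are all placeholder assumptions, not values from the question.

```python
import json
import time
import uuid
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder connection details -- substitute your own namespace, key, and hub.
CONNECTION_STR = "Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>"
EVENTHUB_NAME = "telemetry"           # hypothetical hub name
TARGET_EVENTS_PER_SEC = 5000          # assumed peak velocity to simulate
DURATION_SEC = 300                    # length of the load test window

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

def make_event() -> EventData:
    """Build one synthetic telemetry event (payload shape is an assumption)."""
    payload = {
        "device_id": str(uuid.uuid4()),
        "ts": time.time(),
        "value": 42.0,
    }
    return EventData(json.dumps(payload))

end = time.time() + DURATION_SEC
with producer:
    while time.time() < end:
        window_start = time.time()
        batch = producer.create_batch()
        # Fill one second's worth of events, respecting the batch size limit.
        for _ in range(TARGET_EVENTS_PER_SEC):
            try:
                batch.add(make_event())
            except ValueError:        # batch is full: send it and start a new one
                producer.send_batch(batch)
                batch = producer.create_batch()
                batch.add(make_event())
        producer.send_batch(batch)
        # Sleep off the remainder of the one-second window to hold the rate steady.
        elapsed = time.time() - window_start
        if elapsed < 1.0:
            time.sleep(1.0 - elapsed)
```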
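On the consuming side, the "custom metrics" in option C could be collected with a Spark StreamingQueryListener inside the Databricks job, logging per-microbatch throughput and latency so degradation shows up as the producer ramps toward peak load. This is a sketch only, assuming a Databricks Runtime with PySpark 3.4+ (where Python listeners are supported) and a streaming query that is already reading from Event Hubs.

```python
from pyspark.sql.streaming import StreamingQueryListener

class LoadTestMetricsListener(StreamingQueryListener):
    """Log per-microbatch throughput and duration during the load test."""

    def onQueryStarted(self, event):
        print(f"Load test query started: {event.id}")

    def onQueryProgress(self, event):
        p = event.progress
        # Comparing inputRowsPerSecond with processedRowsPerSecond shows whether
        # the pipeline keeps up with the generated load; trigger duration shows latency.
        print(
            f"batch={p.batchId} "
            f"in_rows/s={p.inputRowsPerSecond} "
            f"proc_rows/s={p.processedRowsPerSecond} "
            f"batch_ms={p.durationMs.get('triggerExecution')}"
        )

    def onQueryTerminated(self, event):
        print(f"Load test query terminated: {event.id}")

# In a Databricks notebook, `spark` is the predefined SparkSession.
# Register the listener before (or while) the streaming query runs.
spark.streams.addListener(LoadTestMetricsListener())
```

If the numbers from this listener are exported to a dashboard (for example via the Databricks metrics or logging tooling you already use), you can correlate producer rate against processing rate and end-to-end batch duration to find the point where the pipeline starts to fall behind.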