How would you design a load test to validate the performance of a global Azure Databricks solution handling data ingestion and processing from geographically distributed sources?
A. Manually distributing datasets across regions and observing the impact on a centralized Databricks workspace, without automation.
B. Utilizing a VPN to mimic geographical latencies and network conditions while ingesting data into a central Databricks workspace for processing.
C. Deploying identical Databricks workspaces in multiple Azure regions, using Azure Traffic Manager to simulate geographically distributed data ingestion, and analyzing regional performance metrics.
D. Simulating geo-distributed data sources with Azure Event Hubs in different regions, directing data to a single Databricks workspace, and monitoring latency and throughput.
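Whichever topology is chosen, the core of such a load test is the same: drive events at each regional ingestion endpoint in parallel and collect per-region latency and throughput metrics for comparison. The sketch below illustrates that harness shape only; the region names are assumptions, and the network send is stubbed out where a real test would call an Event Hubs producer (e.g. `send_batch`) against a per-region namespace.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical regions; a real test would map each to an Event Hubs
# namespace (or other ingestion endpoint) deployed in that Azure region.
REGIONS = ["eastus", "westeurope", "southeastasia"]

def ingest_event(region: str, payload: bytes) -> float:
    """Send one event toward a region's endpoint, return observed latency (s).

    The network call is stubbed out here; replace the placeholder with a
    real producer send when wiring this up against live infrastructure."""
    start = time.perf_counter()
    _ = (region, payload)  # placeholder for the actual send
    return time.perf_counter() - start

def run_load_test(events_per_region: int = 100) -> dict:
    """Fire events at every region in parallel and collect latency stats."""
    results = {}
    with ThreadPoolExecutor(max_workers=len(REGIONS)) as pool:
        futures = {
            region: [pool.submit(ingest_event, region, b"x" * 256)
                     for _ in range(events_per_region)]
            for region in REGIONS
        }
        for region, fs in futures.items():
            latencies = [f.result() for f in fs]
            results[region] = {
                "events": len(latencies),
                "p50_ms": statistics.median(latencies) * 1000,
                "max_ms": max(latencies) * 1000,
            }
    return results

if __name__ == "__main__":
    for region, stats in run_load_test().items():
        print(region, stats)
```

Comparing the per-region percentiles produced by a harness like this is what distinguishes option C (regional workspaces, metrics analyzed per region) from approaches that funnel everything into one centralized workspace.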