
Answer-first summary for fast verification
Answer: Configure Databricks to output logs to Azure Log Analytics and set up alert rules for pipeline failure events.
The most effective method for immediate notification of pipeline failures is option C: configure Databricks to send its diagnostic logs to Azure Log Analytics and define alert rules that fire on pipeline-failure events. This leverages the native integration between Databricks and Azure Monitor for real-time monitoring and alerting. Option A can work, but it requires building and maintaining a custom Azure Function to parse logs, which adds development effort and another point of failure. Option B tracks pipeline executions through Application Insights rather than monitoring the failure logs directly, so alerts may lag behind the actual failure. Option D is inefficient because manual monitoring and email reporting introduce unavoidable delays.
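As an illustrative sketch of the recommended approach (option C), the alert rule could be backed by a Kusto (KQL) query over the Log Analytics workspace. This assumes diagnostic logging for the Databricks `jobs` category has been enabled on the workspace; the table and column names below (`DatabricksJobs`, `ActionName`, the `runFailed` action) follow the common diagnostic-log schema but may vary by schema version, so verify them against your workspace before use:

```kusto
// Hypothetical alert query: count Databricks job runs that failed
// in the last 5 minutes. An Azure Monitor alert rule evaluating this
// query on a short schedule fires whenever FailedRuns > 0.
DatabricksJobs
| where ActionName == "runFailed"
| where TimeGenerated > ago(5m)
| summarize FailedRuns = count()
```

Attaching this query to an Azure Monitor log alert rule with an action group (email, Teams webhook, PagerDuty, etc.) gives the team near-real-time notification without any custom parsing code.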
Author: LeetQuiz Editorial Team
When setting up automated monitoring for a data pipeline in Azure Databricks, which method ensures the data engineering team receives immediate notifications of pipeline failures?
A. Use Databricks job logs with a custom Azure Function to parse logs and send alerts via Microsoft Teams.
B. Implement Azure Monitor with Application Insights to track pipeline executions and configure email alerts for failures.
C. Configure Databricks to output logs to Azure Log Analytics and set up alert rules for pipeline failure events.
D. Rely on manual monitoring of Databricks workspace for job failures and report issues via email.