
Answer-first summary for fast verification
Answer: Implementing Azure Databricks' native logging to capture and store data quality metrics in Azure Log Analytics
The most efficient and scalable approach to tracking and logging data quality issues during ETL processes in Azure Databricks is to leverage Azure Databricks' native logging capabilities and store the resulting metrics in Azure Log Analytics. This provides a centralized platform for monitoring and analyzing data quality issues, enabling easy querying, visualization, and alerting. Other methods, such as printing statements in notebooks or writing logs directly to a database table, offer some level of tracking but lack the robustness and scalability of a centralized logging service. Similarly, sending metrics to Azure Event Hubs is better suited to real-time event streaming than to comprehensive data quality logging and analysis.
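As a concrete illustration, one common way to land custom records in Azure Log Analytics from a Databricks job is the workspace's HTTP Data Collector API. The sketch below is a minimal, hedged example: `WORKSPACE_ID`, `SHARED_KEY`, the `DataQualityMetrics` log type, and the record fields are all placeholders you would replace with your own values, and production pipelines would typically add retries and secret management (e.g. Databricks secrets) rather than inline credentials.

```python
import base64
import hashlib
import hmac
import json
import urllib.request
from datetime import datetime, timezone

# Placeholder values -- replace with your Log Analytics workspace ID and
# primary key (from the Azure portal). These are hypothetical.
WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = base64.b64encode(b"<primary-key>").decode()
LOG_TYPE = "DataQualityMetrics"  # custom log table name (hypothetical)


def build_signature(workspace_id: str, shared_key: str,
                    date: str, content_length: int) -> str:
    """Build the SharedKey authorization header required by the
    Log Analytics HTTP Data Collector API."""
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"


def post_quality_metrics(records: list) -> None:
    """POST a batch of data quality records to Azure Log Analytics."""
    body = json.dumps(records).encode("utf-8")
    rfc1123_date = datetime.now(timezone.utc).strftime(
        "%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(WORKSPACE_ID, SHARED_KEY,
                                         rfc1123_date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    url = (f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
           f"/api/logs?api-version=2016-04-01")
    req = urllib.request.Request(url, data=body, headers=headers,
                                 method="POST")
    with urllib.request.urlopen(req) as resp:
        resp.read()  # 200 OK indicates the records were accepted
```

In an ETL notebook you might call `post_quality_metrics([{"table": "sales", "null_count": 12, "row_count": 10000}])` after a validation step, then query and alert on the resulting custom table from Log Analytics.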
Author: LeetQuiz Editorial Team
How can your team effectively track and log data quality issues encountered during ETL processes in Azure Databricks to ensure high data quality?
A
Sending data quality metrics from Databricks to Azure Event Hubs for real-time monitoring
B
Utilizing Databricks notebooks to print statements for any data quality issues
C
Implementing Azure Databricks' native logging to capture and store data quality metrics in Azure Log Analytics
D
Directly writing logs to a database table with error details and timestamps