How can your team effectively track and log data quality issues encountered during ETL processes in Azure Databricks to ensure high data quality?
A. Sending data quality metrics from Databricks to Azure Event Hubs for real-time monitoring
B. Using print statements in Databricks notebooks to surface any data quality issues
C. Implementing Azure Databricks' native logging to capture and store data quality metrics in Azure Log Analytics
D. Directly writing logs to a database table with error details and timestamps
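As an illustration of the Log Analytics approach in option C, below is a minimal sketch of sending data quality metrics from a Databricks notebook to an Azure Log Analytics workspace using the HTTP Data Collector API. The workspace ID, shared key, custom log type, and metric fields are hypothetical placeholders; in practice credentials would come from a Databricks secret scope.

```python
import base64
import datetime
import hashlib
import hmac
import json

import requests

# Hypothetical workspace credentials -- in practice, read these from a Databricks secret scope.
WORKSPACE_ID = "<log-analytics-workspace-id>"
SHARED_KEY = "<log-analytics-primary-key>"
LOG_TYPE = "EtlDataQuality"  # surfaces in Log Analytics as the custom table EtlDataQuality_CL


def _build_signature(date: str, content_length: int) -> str:
    """Build the SharedKey authorization header required by the Data Collector API."""
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    )
    decoded_key = base64.b64decode(SHARED_KEY)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    return f"SharedKey {WORKSPACE_ID}:{encoded_hash}"


def log_quality_metrics(records: list) -> None:
    """POST a batch of data quality metric records to Azure Log Analytics."""
    body = json.dumps(records).encode("utf-8")
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": _build_signature(rfc1123_date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    uri = (
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
        f"/api/logs?api-version=2016-04-01"
    )
    response = requests.post(uri, data=body, headers=headers)
    response.raise_for_status()


# Example: record a null-count check for one ETL stage (field names are illustrative).
log_quality_metrics([
    {
        "pipeline": "daily_sales_etl",
        "stage": "silver_load",
        "metric": "null_customer_id_count",
        "value": 42,
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
    }
])
```

Once the records land in Log Analytics, they can be queried with KQL, charted in Azure dashboards, and wired to alert rules, which is what makes this option stronger for ongoing monitoring than ad-hoc print statements in a notebook.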