To ensure data integrity and accuracy before deploying updates to your data transformation jobs in Databricks, which automated testing strategy would you implement?
A
Utilize Databricks Jobs API to schedule and run data validation scripts automatically before updates are deployed to production.
B
Develop PyTest test cases that can be run as part of a CI/CD pipeline in Azure DevOps, validating data outputs against expected results.
C
Leverage Databricks MLflow to track data metrics over time, manually reviewing these before approving deployments.
D
Manually execute a set of SQL queries in Databricks notebooks pre- and post-deployment to check data consistency.
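Option B describes the only fully automated approach: PyTest cases that validate transformation outputs against expected results, run as a gating stage in a CI/CD pipeline. A minimal sketch of such tests (the `clean_orders` transformation and its sample data are hypothetical, standing in for a real Databricks job's logic):

```python
# test_transform.py -- hypothetical PyTest cases validating a data
# transformation's output against expected results; run via `pytest`
# in a CI/CD pipeline stage before deploying to production.

def clean_orders(rows):
    """Hypothetical transformation under test: drop rows with a null
    amount and normalize currency codes to uppercase."""
    return [
        {**r, "currency": r["currency"].upper()}
        for r in rows
        if r.get("amount") is not None
    ]

def test_null_amounts_are_dropped():
    rows = [{"amount": 10.0, "currency": "usd"},
            {"amount": None, "currency": "eur"}]
    result = clean_orders(rows)
    assert len(result) == 1  # the null-amount row was removed

def test_currency_codes_normalized():
    rows = [{"amount": 5.0, "currency": "gbp"}]
    assert clean_orders(rows)[0]["currency"] == "GBP"
```

In practice the transformation logic would be imported from the job's own module, and the pipeline (e.g. an Azure DevOps build) would fail the deployment if any assertion fails, which is what makes this strategy automated rather than a manual pre/post check.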