When developing a CI/CD pipeline for deploying Databricks notebooks from development to production environments, which best practice ensures the notebooks are automatically tested for functionality and performance before deployment?
A
Manually export/import notebooks between workspaces, conducting performance testing in the production environment.
B
Implement Azure DevOps Pipelines to run automated tests defined in a separate test suite, deploying notebooks to production upon successful test completion.
C
Use Databricks Repos to manually copy notebooks between workspaces, relying on peer reviews for testing.
D
Configure Git integration with Databricks workspace, pushing changes directly to production branches without automated tests.
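As a rough illustration of what "automated tests defined in a separate test suite" (option B) could look like, below is a minimal pytest sketch that an Azure DevOps pipeline might run on a CI agent before promoting notebooks. The names `etl_utils` and `clean_orders` are hypothetical, standing in for notebook logic refactored into an importable module so it can be tested outside the notebook UI.

```python
# Minimal pytest-style checks an Azure DevOps pipeline could run before deploying
# notebooks. Assumes the notebook's transformation logic has been factored into an
# importable module; etl_utils and clean_orders are hypothetical names.
import time

import pytest
from pyspark.sql import SparkSession

from etl_utils import clean_orders  # hypothetical module extracted from the notebook


@pytest.fixture(scope="session")
def spark():
    # Local Spark session so the suite can run on a CI agent without a cluster.
    return SparkSession.builder.master("local[2]").appName("ci-tests").getOrCreate()


def test_clean_orders_drops_null_ids(spark):
    # Functional check: rows with a null order_id must be removed.
    df = spark.createDataFrame([(1, 10.0), (None, 5.0)], ["order_id", "amount"])
    result = clean_orders(df)
    assert result.filter("order_id IS NULL").count() == 0


def test_clean_orders_meets_latency_budget(spark):
    # Simple performance guardrail on a small sample; tune the budget for real data.
    df = spark.range(100_000).withColumnRenamed("id", "order_id")
    start = time.perf_counter()
    clean_orders(df).count()
    assert time.perf_counter() - start < 30  # seconds
```

In Azure DevOps, a suite like this would typically run as an early pipeline stage (for example, a `pytest` step in `azure-pipelines.yml`), with the stage that deploys notebooks to the production workspace gated on its success.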