
Consider the following scenario involving the migration of a Databricks notebook from one workspace to another within Microsoft Azure: your organization is consolidating its Databricks workspaces to optimize costs and improve governance. You are responsible for migrating a critical notebook that has complex dependencies, specific access control settings, and associated data objects. The migration must ensure minimal downtime, preserve all configurations, and adhere to compliance requirements. Which of the following approaches BEST meets these requirements? Choose one option.
A
Manually copy the notebook and its code to the new workspace, and recreate the dependencies, access control settings, and data objects from scratch, ensuring to document each step for compliance purposes.
B
Utilize the Databricks REST API to programmatically export the notebook along with its metadata, including dependencies and access control settings, from the source workspace, and import it into the target workspace, automatically preserving all configurations and data objects.
C
Export the notebook as a JSON file using the Databricks CLI from the source workspace, and import it into the target workspace, then manually adjust the dependencies, access control settings, and data objects to match the original setup.
D
Develop a new notebook in the target workspace by copying the code from the source notebook, and manually configure each dependency, access control setting, and data object, verifying each step against compliance checklists.
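
For reference, the API-driven export/import described in option B (and the CLI-based variant in option C) can be scripted roughly along the lines below using the Databricks Workspace and Permissions REST APIs. This is only a sketch: the workspace URLs, tokens, and notebook path are placeholder assumptions, principals in the access control list must already exist in the target workspace, and data objects and cluster or library dependencies still need to be handled separately.

# Sketch: export a notebook and its permissions from a source workspace
# and re-import them into a target workspace via the Databricks REST API.
# Hosts, tokens, and the notebook path are hypothetical placeholders.
import requests

SRC_HOST = "https://adb-1111111111111111.1.azuredatabricks.net"   # placeholder
DST_HOST = "https://adb-2222222222222222.2.azuredatabricks.net"   # placeholder
SRC_TOKEN = "<source-workspace-token>"
DST_TOKEN = "<target-workspace-token>"
NOTEBOOK_PATH = "/Shared/etl/critical_notebook"                    # placeholder

def api(host, token, method, endpoint, **kwargs):
    """Thin wrapper around the Databricks REST API with bearer-token auth."""
    resp = requests.request(
        method,
        f"{host}/api/2.0{endpoint}",
        headers={"Authorization": f"Bearer {token}"},
        **kwargs,
    )
    resp.raise_for_status()
    return resp.json()

# 1. Look up the notebook's metadata (object_id, language) in the source workspace.
src_obj = api(SRC_HOST, SRC_TOKEN, "GET", "/workspace/get-status",
              params={"path": NOTEBOOK_PATH})

# 2. Export the notebook source as base64-encoded content.
export = api(SRC_HOST, SRC_TOKEN, "GET", "/workspace/export",
             params={"path": NOTEBOOK_PATH, "format": "SOURCE"})

# 3. Capture its access control list via the Permissions API.
acl = api(SRC_HOST, SRC_TOKEN, "GET",
          f"/permissions/notebooks/{src_obj['object_id']}")

# 4. Import into the target workspace, creating the parent folder first
#    and overwriting any existing notebook at the same path.
api(DST_HOST, DST_TOKEN, "POST", "/workspace/mkdirs",
    json={"path": NOTEBOOK_PATH.rsplit("/", 1)[0]})
api(DST_HOST, DST_TOKEN, "POST", "/workspace/import",
    json={"path": NOTEBOOK_PATH, "format": "SOURCE",
          "language": src_obj.get("language", "PYTHON"),
          "content": export["content"], "overwrite": True})

# 5. Re-apply explicit (non-inherited) grants to the imported notebook.
#    GET returns "all_permissions" per principal; PUT expects flat
#    {principal, permission_level} entries, so reshape accordingly.
new_acl = []
for entry in acl.get("access_control_list", []):
    principal = {k: v for k, v in entry.items()
                 if k in ("user_name", "group_name", "service_principal_name")}
    for perm in entry.get("all_permissions", []):
        if not perm.get("inherited", False):
            new_acl.append({**principal,
                            "permission_level": perm["permission_level"]})

dst_obj = api(DST_HOST, DST_TOKEN, "GET", "/workspace/get-status",
              params={"path": NOTEBOOK_PATH})
if new_acl:
    api(DST_HOST, DST_TOKEN, "PUT",
        f"/permissions/notebooks/{dst_obj['object_id']}",
        json={"access_control_list": new_acl})

The CLI route mentioned in option C wraps the same Workspace API calls (for example, databricks workspace export and databricks workspace import), but permissions and data objects are not carried across by the notebook export itself and must be migrated explicitly, as in the sketch above.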