
Answer-first summary for fast verification
Answer: Utilize the Databricks REST API to programmatically export the notebook along with its metadata, including dependencies and access control settings, from the source workspace, and import it into the target workspace, automatically preserving all configurations and data objects.
Option B is the most efficient and reliable method for migrating a Databricks notebook between workspaces. Using the Databricks REST API, the notebook, together with its metadata, dependencies, access control settings, and associated data objects, can be exported from the source workspace and imported into the target workspace programmatically. This approach minimizes the risk of human error, supports compliance by preserving the original configuration, and reduces downtime by automating the migration. The manual approaches (options A, C, and D) are more error-prone and do not guarantee that every setting is preserved, making them unsuitable for critical migrations.
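The export/import flow that option B describes maps onto the Workspace API's `/api/2.0/workspace/export` and `/api/2.0/workspace/import` endpoints, which exchange notebook content as base64. A minimal Python sketch, assuming hypothetical workspace URLs, tokens, and notebook paths (substitute your own):

```python
import base64
import requests


def export_notebook(host, token, path):
    """Export notebook source from the source workspace (base64-decoded bytes)."""
    resp = requests.get(
        f"{host}/api/2.0/workspace/export",
        headers={"Authorization": f"Bearer {token}"},
        params={"path": path, "format": "SOURCE"},  # SOURCE keeps code editable
    )
    resp.raise_for_status()
    return base64.b64decode(resp.json()["content"])


def build_import_payload(path, source, language="PYTHON"):
    """Build the JSON body expected by /api/2.0/workspace/import."""
    return {
        "path": path,
        "format": "SOURCE",
        "language": language,
        "content": base64.b64encode(source).decode("ascii"),
        "overwrite": True,
    }


def import_notebook(host, token, path, source, language="PYTHON"):
    """Import notebook source into the target workspace, overwriting if present."""
    resp = requests.post(
        f"{host}/api/2.0/workspace/import",
        headers={"Authorization": f"Bearer {token}"},
        json=build_import_payload(path, source, language),
    )
    resp.raise_for_status()


if __name__ == "__main__":
    # Hypothetical hosts, tokens, and path -- placeholders, not real values.
    src = export_notebook(
        "https://adb-111.azuredatabricks.net", "SRC_TOKEN", "/Shared/etl_notebook"
    )
    import_notebook(
        "https://adb-222.azuredatabricks.net", "TGT_TOKEN", "/Shared/etl_notebook", src
    )
```

In a real migration this loop would be driven by a workspace listing (`/api/2.0/workspace/list`) so that entire folders move in one automated pass.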
Author: LeetQuiz Editorial Team
In the context of migrating a Databricks notebook from one workspace to another within Microsoft Azure, consider the following scenario: Your organization is consolidating its Databricks workspaces to optimize costs and improve governance. You are responsible for migrating a critical notebook that includes complex dependencies, specific access control settings, and associated data objects. The migration must ensure minimal downtime, preserve all configurations, and adhere to compliance requirements. Which of the following approaches BEST meets these requirements? Choose one option.
A
Manually copy the notebook and its code to the new workspace, and recreate the dependencies, access control settings, and data objects from scratch, ensuring to document each step for compliance purposes.
B
Utilize the Databricks REST API to programmatically export the notebook along with its metadata, including dependencies and access control settings, from the source workspace, and import it into the target workspace, automatically preserving all configurations and data objects.
C
Export the notebook as a JSON file using the Databricks CLI from the source workspace, and import it into the target workspace, then manually adjust the dependencies, access control settings, and data objects to match the original setup.
D
Develop a new notebook in the target workspace by copying the code from the source notebook, and manually configure each dependency, access control setting, and data object, verifying each step against compliance checklists.
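One detail worth noting for the REST API approach in option B: notebook permissions are not part of the exported notebook file, so a programmatic migration also reads them from the Permissions API (`/api/2.0/permissions/notebooks/{object_id}`) on the source and re-applies them on the target. A hedged sketch of the payload translation, assuming the standard GET response shape and a sample ACL invented for illustration:

```python
def acl_to_put_payload(get_response):
    """Convert a Permissions API GET response into the PUT body that
    re-applies the same grants on the target notebook, skipping
    permissions inherited from parent folders."""
    entries = []
    for item in get_response.get("access_control_list", []):
        # Each ACL entry names exactly one principal type.
        principal = {
            k: item[k]
            for k in ("user_name", "group_name", "service_principal_name")
            if k in item
        }
        for perm in item.get("all_permissions", []):
            if not perm.get("inherited", False):
                entries.append({**principal, "permission_level": perm["permission_level"]})
    return {"access_control_list": entries}


# Illustrative GET response, trimmed to the relevant fields.
sample = {
    "access_control_list": [
        {
            "user_name": "alice@example.com",
            "all_permissions": [{"permission_level": "CAN_MANAGE", "inherited": False}],
        },
        {
            "group_name": "admins",
            "all_permissions": [{"permission_level": "CAN_MANAGE", "inherited": True}],
        },
    ],
}
payload = acl_to_put_payload(sample)
# Only alice's direct grant is carried over; the inherited admin grant
# is dropped, since it will be re-inherited in the target workspace.
```

The resulting payload is then sent with a `PUT` to the same permissions endpoint on the target workspace after the notebook has been imported.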