
A Databricks job is configured with three tasks: Task A runs first, and Tasks B and C run in parallel once Task A completes. If Tasks A and B finish successfully but Task C fails, what is the resulting state of the data in the Lakehouse?
A
No changes will be saved to the Lakehouse unless all tasks are successful; the failure of Task C will trigger an automatic rollback of all changes made by Tasks A and B.
B
As tasks are managed based on dependencies, changes are held in a temporary state and won’t be committed to the Lakehouse until every task in the job finishes successfully.
C
The logic in the notebooks for Tasks A and B will be completed and committed successfully, but any operations performed by Task C might have been partially completed, leaving some of its changes partially persisted.
D
The logic in the notebooks for Tasks A and B will be completed successfully, but all changes made by Task C will be automatically undone by the Databricks Jobs service to maintain environment consistency.
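For reference, the dependency wiring the question describes (Tasks B and C fanning out from Task A) can be sketched as a Databricks Jobs API 2.1 `tasks` payload. This is a minimal sketch, not a complete job definition; the task keys and notebook paths are illustrative assumptions.

```python
# Hedged sketch of the three-task job from the question, expressed as a
# Jobs API 2.1 payload. Task keys and notebook paths are assumptions.
job_spec = {
    "name": "three-task-example",
    "tasks": [
        {
            "task_key": "task_a",
            "notebook_task": {"notebook_path": "/Jobs/task_a"},
        },
        {
            "task_key": "task_b",
            # B waits for A to succeed before starting.
            "depends_on": [{"task_key": "task_a"}],
            "notebook_task": {"notebook_path": "/Jobs/task_b"},
        },
        {
            "task_key": "task_c",
            # C also depends only on A, so B and C run in parallel.
            "depends_on": [{"task_key": "task_a"}],
            "notebook_task": {"notebook_path": "/Jobs/task_c"},
        },
    ],
}

# Tasks whose only upstream dependency is task_a run in parallel after it.
parallel_after_a = [
    t["task_key"]
    for t in job_spec["tasks"]
    if t.get("depends_on") == [{"task_key": "task_a"}]
]
print(parallel_after_a)  # -> ['task_b', 'task_c']
```

Because each task commits its own writes as it runs, there is no job-level transaction wrapping all three tasks: a failure in Task C does not roll back what Tasks A and B already committed.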