
A Databricks job is configured with three tasks, each of which is a notebook: Task A has no dependencies, while Tasks B and C run in parallel, each with a serial dependency on Task A.
If, during a scheduled run, Tasks A and B complete successfully but Task C fails, which statement describes the result of the run?
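For concreteness, this dependency graph could be expressed in a Databricks Jobs API 2.1 create payload roughly like the sketch below; the job name, task keys, and notebook paths are hypothetical placeholders.

```python
# Hypothetical Jobs API 2.1 payload (POST /api/2.1/jobs/create) describing the
# dependency graph in the question: Task A has no dependencies; Tasks B and C
# each declare a serial dependency on Task A, so they start only after A
# succeeds and then run in parallel with each other.
job_payload = {
    "name": "three-task-example-job",  # hypothetical job name
    "tasks": [
        {
            "task_key": "task_a",
            "notebook_task": {"notebook_path": "/Workspace/Jobs/task_a"},  # hypothetical path
        },
        {
            "task_key": "task_b",
            "depends_on": [{"task_key": "task_a"}],  # serial dependency on Task A
            "notebook_task": {"notebook_path": "/Workspace/Jobs/task_b"},
        },
        {
            "task_key": "task_c",
            "depends_on": [{"task_key": "task_a"}],  # serial dependency on Task A
            "notebook_task": {"notebook_path": "/Workspace/Jobs/task_c"},
        },
    ],
}
```

With this layout, Tasks B and C are launched only after Task A succeeds, and each then runs as its own notebook execution.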
A
All logic expressed in the notebooks associated with Tasks A and B will have been successfully completed; some operations in Task C may have completed successfully.
B
Unless all tasks complete successfully, no changes will be committed to the Lakehouse; because Task C failed, all commits will be rolled back automatically.
C
Because all tasks are managed as a dependency graph, no changes will be committed to the Lakehouse until every task has completed successfully.
D
All logic expressed in the notebooks associated with Tasks A and B will have been successfully completed; any changes made in Task C will be rolled back due to the task failure.