Consider the following scenario: you are building an incremental data pipeline with Delta Lake on Azure Databricks. Your organization requires strict compliance with data governance policies, including the ability to audit all changes made to the data, and the solution must also support debugging by allowing data engineers to trace any data inconsistency back to its origin. Given these requirements, what is the significance of tracking the history of table transactions in Delta Lake, and how can you review this history? Choose the option that best addresses both the auditing and debugging needs.
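As context for the question, the sketch below shows the two standard ways to review a Delta table's transaction history: the `DESCRIBE HISTORY` SQL command and the `DeltaTable.history()` Python API. It assumes a running Databricks (or local `delta-spark`) session where `spark` is already defined, and the table name `sales.orders` is a placeholder, not part of the original question.

```python
# Sketch: reviewing Delta Lake table history for auditing and debugging.
# Assumes an active SparkSession named `spark` (as in a Databricks
# notebook) and a hypothetical Delta table called "sales.orders".
from delta.tables import DeltaTable

# SQL path: DESCRIBE HISTORY returns one row per commit, including the
# version, timestamp, user, operation, and operation metrics -- the raw
# material for an audit trail of every change to the table.
spark.sql("DESCRIBE HISTORY sales.orders").show(truncate=False)

# Programmatic path: history() returns the same commit log as a
# DataFrame, so it can be filtered -- e.g. to find all DELETE commits
# when tracing where an inconsistency was introduced.
history = DeltaTable.forName(spark, "sales.orders").history()
(history.filter("operation = 'DELETE'")
        .select("version", "timestamp", "userName", "operationParameters")
        .show(truncate=False))
```

Because each history row carries a version number, a debugging session can follow up with time travel (e.g. `SELECT * FROM sales.orders VERSION AS OF 12`) to inspect the table as it existed before a suspect commit.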