In the event of data corruption in a Delta Lake table caused by a faulty data ingestion job, how can Delta Lake's time travel feature in Databricks be used to pinpoint the corrupting change and revert the table to its pre-corruption state?
A. Applying the VERSION AS OF option in a Delta table read operation to audit data changes and manually correct the corrupted data
B. Utilizing the DESCRIBE HISTORY command to identify the corrupted version and the RESTORE command to revert to a previous version
C. Implementing a Spark SQL query with the TIMESTAMP AS OF clause to access historical data and identify discrepancies
D. Leveraging MLflow to track data versioning and roll back the Delta table to a stable state before the corruption
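The commands named in option B form the standard Delta Lake pinpoint-and-revert workflow. Below is a minimal Spark SQL sketch, assuming a Delta table named sales_ingest and that version 12 turns out to be the last known-good version (both the table name and version number are hypothetical):

-- Inspect the table's commit history to find the version
-- written by the faulty ingestion job.
DESCRIBE HISTORY sales_ingest;

-- Optionally audit the data as of a candidate pre-corruption
-- version to confirm it is clean (time travel read).
SELECT * FROM sales_ingest VERSION AS OF 12;

-- Revert the table to that known-good version.
RESTORE TABLE sales_ingest TO VERSION AS OF 12;

RESTORE also accepts TIMESTAMP AS OF in place of VERSION AS OF, which is the same mechanism option C's read-only query uses; the difference is that options A and C only read historical data, while RESTORE actually reverts the table's current state.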