A data governance team is reviewing a process for removing user data from a Databricks Lakehouse to comply with GDPR requirements. The following code snippet propagates deletions from a `lookup_table` to an `aggregates_table` using the Delta Lake Change Data Feed (CDF):

```python
# Read the change feed from lookup_table for the given time window.
# ("readChangeData" is the legacy alias for the documented "readChangeFeed" option.)
(spark.read
    .format("delta")
    .option("readChangeData", True)
    .option("startingTimestamp", "2024-08-22 00:00:00")
    .option("endingTimestamp", "2024-08-29 00:00:00")
    .table("lookup_table")
    .createOrReplaceTempView("changes"))

# Propagate deletions: CDF exposes the change type in the _change_type column.
spark.sql("""
    DELETE FROM aggregates_table
    WHERE user_id IN (
        SELECT user_id FROM changes
        WHERE _change_type = 'delete'
    )
""")
```

After this code executes successfully, will the deleted records in `aggregates_table` be permanently inaccessible? Why or why not?
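Worth keeping in mind when reasoning about the answer: a Delta `DELETE` only commits a new table version, and the data files backing earlier versions remain on storage, reachable through time travel, until `VACUUM` removes them. Below is a minimal sketch of checking for and then completing the physical removal, assuming the table names above; the version number `5` is purely illustrative:

```python
# The "deleted" rows are still readable via time travel against any
# version committed before the DELETE ran:
spark.sql("DESCRIBE HISTORY aggregates_table").show()  # locate the pre-delete version
spark.sql("SELECT * FROM aggregates_table VERSION AS OF 5").show()  # 5 is illustrative

# Permanent removal requires VACUUM after the retention window elapses
# (7 days by default). Forcing an immediate vacuum means disabling the
# safety check, generally discouraged outside dedicated compliance jobs:
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
spark.sql("VACUUM aggregates_table RETAIN 0 HOURS")  # drop unreferenced data files
```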