
Answer-first summary for fast verification
Answer: Newly updated records will be appended to the target table.
Databricks supports reading table changes captured by Change Data Feed (CDF) in streaming queries via `spark.readStream`. Because the stream's checkpoint tracks progress, each run retrieves only the changes committed since the last execution. The query appends those rows to the target table on each run because `writeStream` uses the default output mode, `append`. For more details, see the [Databricks documentation on Delta Change Data Feed](https://docs.databricks.com/delta/delta-change-data-feed.html#read-changes-in-streaming-queries).
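To make the "only new changes, appended each run" behavior concrete without a Spark cluster, here is a minimal pure-Python sketch that simulates the checkpointing semantics described above. All names (`run_stream`, `_commit_version`, the sample rows) are hypothetical illustrations, not Spark APIs; the real behavior comes from Delta CDF and Structured Streaming checkpoints.

```python
# Illustrative simulation: a checkpoint records the last processed commit
# version, so each run sees only newer changes, keeps the
# "update_postimage" rows, and appends them to the target (no merge,
# no overwrite, no reprocessing of history).

def run_stream(change_feed, checkpoint, target):
    """Append new update_postimage rows to target, then advance the checkpoint."""
    last = checkpoint.get("version", -1)
    new_changes = [c for c in change_feed if c["_commit_version"] > last]
    for c in new_changes:
        if c["_change_type"] == "update_postimage":
            target.append(c["row"])  # default writeStream mode: append
    if new_changes:
        checkpoint["version"] = max(c["_commit_version"] for c in new_changes)
    return target

change_feed = [
    {"_commit_version": 1, "_change_type": "insert",           "row": "alice v1"},
    {"_commit_version": 2, "_change_type": "update_preimage",  "row": "alice v1"},
    {"_commit_version": 2, "_change_type": "update_postimage", "row": "alice v2"},
]
checkpoint, target = {}, []

run_stream(change_feed, checkpoint, target)  # first run: sees all changes so far
run_stream(change_feed, checkpoint, target)  # rerun with no new commits: appends nothing

# A new update arrives; only it is processed on the next run:
change_feed.append({"_commit_version": 3, "_change_type": "update_postimage", "row": "alice v3"})
run_stream(change_feed, checkpoint, target)
print(target)  # ['alice v2', 'alice v3'] — updated rows appended, never duplicated
```

Note how the second run appends nothing: the checkpoint is what rules out option C (re-appending the whole history) in the question below.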
Author: LeetQuiz Editorial Team
When executing the following query on the Delta table 'customers' with Change Data Feed enabled, what describes the results each time the query is run?
```python
from pyspark.sql.functions import col

(spark.readStream
    .option("readChangeFeed", "true")
    .option("startingVersion", 0)
    .table("customers")
    .filter(col("_change_type").isin(["update_postimage"]))
    .writeStream
    .option("checkpointLocation", "dbfs:/checkpoints")
    .trigger(availableNow=True)
    .table("customers_updates")
)
```
A. Newly updated records will be merged into the target table, modifying previous entries with the same primary keys.
B. Newly updated records will overwrite the target table.
C. The entire history of updated records will be appended to the target table at each execution, which leads to duplicate entries.
D. Newly updated records will be appended to the target table.
E. The entire history of updated records will overwrite the target table at each execution.