
Answer-first summary for fast verification
Answer: Every half second
By default, if no trigger interval is specified, Structured Streaming processes data in micro-batches every 500 milliseconds. This is equivalent to writing `trigger(processingTime='500ms')` explicitly. Reference: [Databricks Documentation on Triggers](https://docs.databricks.com/structured-streaming/triggers.html#what-is-the-default-trigger-interval).
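The default can be written out explicitly. The sketch below assumes a running `SparkSession`, an existing streaming table named `orders`, and a `checkpointPath` variable already defined; it is illustrative, not runnable standalone:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Spelling out the default trigger: a new micro-batch starts
# every 500 ms, as soon as the previous batch has finished.
query = (
    spark.readStream
    .table("orders")
    .writeStream
    .option("checkpointLocation", checkpointPath)  # checkpointPath assumed defined
    .trigger(processingTime="500ms")               # same behavior as omitting trigger()
    .toTable("Output_Table")
)
```

Because the query in this question never calls `trigger()`, it falls back to this 500 ms processing-time default, which is why option A is correct.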
Author: LeetQuiz Editorial Team
Given the following Structured Streaming query:
```python
spark.readStream
    .table("orders")
    .writeStream
    .option("checkpointLocation", checkpointPath)
    .table("Output_Table")
```
Which of the following describes the trigger interval for this query?
A
Every half second
B
Every half hour
C
The query will run in batch mode to process all available data at once, then the trigger stops.
D
More information is needed to determine the correct response.
E
Every half minute