How can you optimize a Spark Structured Streaming application for low-latency processing of financial transactions to ensure minimal processing time per micro-batch without compromising stateful accuracy?
A. Utilize `trigger(Trigger.ProcessingTime("1 second"))` and fine-tune `spark.streaming.blockInterval`.
B. Enable `spark.streaming.receiver.maxRate` and set a high `spark.sql.streaming.metricsEnabled` value.
C. Implement stateful transformations using `mapGroupsWithState` with a low trigger interval.
D. Opt for `flatMapGroupsWithState` with explicit state timeout settings and adjust `spark.sql.streaming.schemaInference`.
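For context, here is a minimal sketch of the stateful pattern that options C and D refer to: an arbitrary-stateful aggregation (`mapGroupsWithState`) with an explicit processing-time state timeout, combined with a short `Trigger.ProcessingTime` interval. The `Txn` and `AccountState` types, the `rate` source, and the specific intervals are illustrative assumptions, not part of the question.

```scala
import java.sql.Timestamp
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.{GroupState, GroupStateTimeout, Trigger}

// Hypothetical record and state types for illustration.
case class Txn(accountId: String, amount: Double, ts: Timestamp)
case class AccountState(count: Long, total: Double)

object LowLatencyStateful {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("txn-agg").getOrCreate()
    import spark.implicits._

    // Placeholder source; a real job would read the transaction feed from Kafka or similar.
    val txns = spark.readStream
      .format("rate").load()
      .selectExpr(
        "CAST(value AS STRING) AS accountId",
        "CAST(value AS DOUBLE) AS amount",
        "timestamp AS ts")
      .as[Txn]

    // Per-account running aggregate with an explicit processing-time timeout,
    // so idle keys are evicted and the state store stays bounded.
    val updated = txns
      .groupByKey(_.accountId)
      .mapGroupsWithState(GroupStateTimeout.ProcessingTimeTimeout) {
        (accountId: String, rows: Iterator[Txn], state: GroupState[AccountState]) =>
          if (state.hasTimedOut) {
            // No new rows for this key within the timeout: emit and drop its state.
            val expired = state.get
            state.remove()
            (accountId, expired)
          } else {
            val old   = state.getOption.getOrElse(AccountState(0L, 0.0))
            val batch = rows.toSeq
            val next  = AccountState(old.count + batch.size, old.total + batch.map(_.amount).sum)
            state.update(next)
            state.setTimeoutDuration("10 minutes") // illustrative timeout
            (accountId, next)
          }
      }

    // A short trigger keeps per-micro-batch latency low; "1 second" is illustrative.
    updated.writeStream
      .outputMode("update") // mapGroupsWithState requires Update output mode
      .format("console")
      .trigger(Trigger.ProcessingTime("1 second"))
      .start()
      .awaitTermination()
  }
}
```

Bounding state with an explicit timeout is what keeps short trigger intervals cheap: each micro-batch only touches live keys, so per-batch processing time stays low without sacrificing the accuracy of the per-key state.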