
Answer-first summary for fast verification
Answer: Increase the number of streaming units (SUs).
The **Backlogged Input Events** metric in Azure Stream Analytics indicates that the job is receiving events faster than it can process them, producing a growing queue of unprocessed events. A backlog that remains consistently above zero means the job's processing capacity is insufficient for the incoming event rate.

**Why Option B is correct:**

- Increasing the number of Streaming Units (SUs) directly scales the processing capacity of the Stream Analytics job.
- Each SU provides approximately 1 MB/second of processing throughput.
- Scaling up SUs allows the job to process events faster, reducing and eventually eliminating the backlog.
- This is the primary scaling mechanism for handling increased event volume in Stream Analytics.

**Why the other options are not suitable:**

- **Option A (Change compatibility level):** The compatibility level affects query language features and behavior, not processing capacity or throughput.
- **Option C (Use $default consumer group):** The consumer group determines how events are read from Event Hubs; it does not address insufficient processing capacity.
- **Option D (Create additional output stream):** Adding an output does not increase processing capacity for input events; it only creates another destination for processed data.

The most effective solution for a consistently growing backlog is to increase the job's processing capacity by scaling up the Streaming Units.
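The capacity argument above can be sketched as a toy model: treat throughput as SUs times per-SU capacity, and the backlog grows whenever the input rate exceeds that capacity. This is an illustrative simplification (the ~1 MB/s per SU figure is a rough guideline; real throughput depends on query complexity and partitioning), and the function name is hypothetical, not part of any Azure SDK.

```python
def backlog_after(seconds, input_mb_per_s, streaming_units, su_mb_per_s=1.0):
    """Approximate backlog (MB) after `seconds` of steady input.

    Toy model only: assumes each SU processes ~su_mb_per_s MB/s,
    a rough guideline, not a guaranteed Azure figure.
    """
    capacity = streaming_units * su_mb_per_s
    deficit = input_mb_per_s - capacity   # positive => events pile up
    return max(0.0, deficit * seconds)

# 3 MB/s of input against 1 SU: the backlog grows 2 MB every second.
print(backlog_after(60, input_mb_per_s=3, streaming_units=1))  # 120.0
# Scaling to 3 SUs matches the input rate, so no backlog accumulates.
print(backlog_after(60, input_mb_per_s=3, streaming_units=3))  # 0.0
```

This mirrors the reasoning behind Option B: none of the other options change the capacity term, so only adding SUs can turn a positive deficit into zero.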
Author: LeetQuiz Editorial Team
You are monitoring an Azure Stream Analytics job and observe that the Backlogged Input Events metric is gradually increasing and remains consistently above zero. What should you do to ensure the job can process all events?
A. Change the compatibility level of the Stream Analytics job.
B. Increase the number of streaming units (SUs).
C. Remove any named consumer groups from the connection and use $default.
D. Create an additional output stream for the existing input stream.