In a scenario where you are tasked with processing real-time data streams using Azure Stream Analytics, how would you ensure that the data ingestion process avoids duplicates and guarantees exactly-once delivery semantics?
A. Use the checkpointing feature of Azure Stream Analytics to periodically save the state of the stream.
B. Implement custom logic to track the unique identifier of each event and filter out duplicates manually.
C. Configure the input data stream to have a unique partition key and use the Event Hubs Capture feature.
D. Enable the Exactly Once Delivery feature in the Azure Stream Analytics job configuration.
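To make the deduplication approach in option B concrete, here is a minimal, hedged sketch of filtering a stream by a unique event identifier. This is plain Python, not Azure Stream Analytics query language; the event shape (a dict with an `id` field) and the function name `deduplicate` are illustrative assumptions, not part of any Azure API.

```python
def deduplicate(events):
    """Yield each event only the first time its 'id' value is seen.

    Illustrative only: the event schema ('id' key) is a hypothetical
    example, and the seen-set grows without bound, so a real stream
    processor would expire old identifiers (e.g., per time window).
    """
    seen = set()
    for event in events:
        if event["id"] not in seen:
            seen.add(event["id"])
            yield event


stream = [
    {"id": 1, "value": "a"},
    {"id": 2, "value": "b"},
    {"id": 1, "value": "a"},  # duplicate delivery of event 1
]

unique = list(deduplicate(stream))
# unique keeps only the first occurrence of each id, in arrival order
```

In practice this manual approach trades memory for correctness: the identifier set must be bounded (for example, by evicting ids older than the maximum expected redelivery window), which is why built-in checkpointing or platform-level delivery guarantees are usually preferred when available.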