In the context of Azure Databricks, Structured Streaming and Delta Lake enable several design patterns for efficient data processing. Consider a scenario in which a financial services company needs to monitor transactions in real time to detect fraudulent activity immediately. The solution must process data continuously, support event-time processing to handle late-arriving data, and maintain state to track transactions over time. Given these requirements, which design pattern would be the BEST choice to implement? Choose the correct option from the following:
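The requirements in the scenario (continuous processing, event-time handling with watermarks, and state maintained across micro-batches) can be sketched with Structured Streaming as follows. This is a minimal, hypothetical illustration: the Kafka topic name, the `amount` threshold in `is_suspicious`, and the schema are assumptions, not part of the question.

```python
def is_suspicious(amount: float, threshold: float = 10000.0) -> bool:
    """Toy fraud rule: flag any single transaction above the threshold.

    The threshold value is an assumption for illustration only.
    """
    return amount > threshold


def main() -> None:
    # PySpark imports are kept inside main() so the helper above can be
    # used without a Spark installation.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("fraud-monitor").getOrCreate()

    # Hypothetical source: a Kafka topic of JSON transactions with
    # fields event_time (timestamp), account_id (string), amount (double).
    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
        .option("subscribe", "transactions")               # assumed topic
        .load()
    )

    txns = (
        raw.selectExpr("CAST(value AS STRING) AS json")
        .select(
            F.get_json_object("json", "$.event_time").cast("timestamp").alias("event_time"),
            F.get_json_object("json", "$.account_id").alias("account_id"),
            F.get_json_object("json", "$.amount").cast("double").alias("amount"),
        )
    )

    # Event-time processing with a watermark tolerates late-arriving data
    # (here: up to 10 minutes late); the windowed aggregation maintains
    # per-account state across micro-batches.
    alerts = (
        txns.withWatermark("event_time", "10 minutes")
        .groupBy(
            F.window("event_time", "5 minutes"),
            F.col("account_id"),
        )
        .agg(F.sum("amount").alias("total_amount"))
        .where(F.col("total_amount") > 10000.0)  # same toy threshold
    )

    # Write alerts continuously to a Delta table for downstream action.
    (
        alerts.writeStream.format("delta")
        .outputMode("update")
        .option("checkpointLocation", "/chk/fraud")  # assumed path
        .start("/delta/fraud_alerts")                # assumed path
        .awaitTermination()
    )


if __name__ == "__main__":
    main()
```

The watermark plus windowed aggregation is what distinguishes this stateful, event-time pattern from a simple stateless append pipeline, which is the crux of the question.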