In a stream processing solution, you need to process highly complex data that requires real-time anomaly detection. How would you approach this task to ensure efficient and accurate processing?
A. Use a simple processing approach to handle the complexity of the data.
B. Use a distributed processing framework like Spark to process the data and perform real-time anomaly detection.
C. Use a distributed processing framework like Spark to process the data, perform real-time anomaly detection, and create windowed aggregates.
D. Use a distributed processing framework like Spark to process the data, perform real-time anomaly detection, create windowed aggregates, and handle schema drift.
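The core idea behind the stronger options is combining a sliding window of recent values (windowed aggregation) with a statistical test on each incoming value (anomaly detection). The sketch below is a minimal, single-process illustration of that pattern in plain Python, not Spark code: the class name, z-score test, and threshold are illustrative assumptions, while a framework like Spark Structured Streaming would apply the same window-plus-test logic in a distributed, fault-tolerant way.

```python
from collections import deque
import statistics

class WindowedAnomalyDetector:
    """Sliding-window anomaly detector (illustrative sketch).

    Keeps the last `window_size` values and flags a new value as
    anomalous when its z-score against the window exceeds `threshold`.
    """

    def __init__(self, window_size=10, threshold=3.0):
        self.window = deque(maxlen=window_size)  # windowed aggregate state
        self.threshold = threshold

    def process(self, value):
        """Return True if `value` is anomalous relative to the current window."""
        is_anomaly = False
        if len(self.window) >= 2:  # need at least two points for a stdev
            mean = statistics.mean(self.window)
            stdev = statistics.stdev(self.window)
            if stdev > 0 and abs(value - mean) / stdev > self.threshold:
                is_anomaly = True
        self.window.append(value)  # slide the window forward
        return is_anomaly

# Simulated stream: steady values with one obvious spike.
detector = WindowedAnomalyDetector(window_size=5, threshold=3.0)
stream = [10, 11, 10, 12, 11, 100, 10, 11]
flags = [detector.process(v) for v in stream]
print(flags)  # the spike (100) is flagged; normal values are not
```

In a real deployment the per-key window state would live inside the streaming framework (e.g. Spark's stateful aggregations over event-time windows), which also handles partitioning, late data, and recovery.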