
Answer-first summary for fast verification
Answer: Use Spark ML's streaming capabilities with low batch intervals; challenges include resource management and data consistency.
For real-time machine learning predictions, apply a trained Spark ML model to a streaming DataFrame via Spark Structured Streaming. With short trigger intervals (low batch intervals), each micro-batch is scored almost as soon as it arrives. This approach does come with challenges: computational resources must be managed carefully so the cluster keeps pace with the continuous data flow, and data consistency and model accuracy must be maintained in a dynamic environment, since a model trained offline can grow stale as the incoming data distribution shifts.
Author: LeetQuiz Editorial Team
Imagine you are working on a project that requires real-time machine learning predictions. Describe how you would implement a real-time prediction system using Spark ML and what challenges you might encounter in ensuring low latency and high throughput.
A
Implement a batch processing system with Spark ML; challenges include latency due to batch intervals.
B
Use Spark ML's streaming capabilities with low batch intervals; challenges include resource management and data consistency.
C
Develop a standalone scikit-learn model for real-time predictions; challenges include scalability issues.
D
Rely on pre-computed predictions to avoid real-time processing; challenges include outdated predictions.