
Answer-first summary for fast verification
Answer: Implement dynamic resource allocation and autoscaling based on real-time metrics to optimize both cost and performance.
Dynamic resource allocation and autoscaling let the cluster grow and shrink with the actual workload: executors are added when tasks back up and released when they sit idle. This keeps large datasets processing efficiently while capping spend, since the job pays only for the capacity it is actively using.
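As a concrete sketch, dynamic allocation can be enabled through Spark configuration at submit time. The executor bounds, timeout, and script name below are illustrative placeholders, not prescriptive values:

```shell
# Sketch: enable Spark dynamic allocation for a batch job.
# Executor counts, idle timeout, and batch_job.py are hypothetical.
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=2 \
  --conf spark.dynamicAllocation.maxExecutors=50 \
  --conf spark.dynamicAllocation.executorIdleTimeout=60s \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=true \
  batch_job.py
```

Here `maxExecutors` acts as the cost ceiling, while `minExecutors` guarantees a performance floor; `shuffleTracking.enabled` lets executors be released safely when no external shuffle service is configured.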
Author: LeetQuiz Editorial Team
During the deployment of a Spark batch job, you need to ensure that the job can handle large datasets efficiently while adhering to cost constraints. What strategies would you employ to achieve this?
A. Use a high number of small, inexpensive clusters to distribute the load and reduce costs.
B. Implement dynamic resource allocation and autoscaling based on real-time metrics to optimize both cost and performance.
C. Run the batch job on a single, large cluster to minimize latency.
D. Increase the batch interval to reduce processing costs.