Which of the following statements about Spark's reliability is false?
A. Spark is designed to survive the loss of any set of worker nodes.
B. Spark will rerun tasks that failed because their worker node failed.
C. Spark will recompute data that was cached on failed worker nodes.
D. Spark will spill data to disk if it does not fit in memory.
E. Spark will reassign the driver to another worker node if the driver's node fails.
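A note on statement E: worker-side fault tolerance (rerunning failed tasks, recomputing lost cached partitions) and spilling to disk are built into Spark, but driver recovery is not automatic, which is why E is the false statement. On the standalone cluster manager, driver restarts must be opted into explicitly, for example via `spark-submit`'s `--supervise` flag in cluster deploy mode. A minimal sketch (the master URL and application script below are hypothetical):

```shell
# Driver recovery in Spark is opt-in, not automatic. Running the driver in
# cluster deploy mode with --supervise asks the standalone cluster manager
# to restart the driver process if it exits with a non-zero status.
# (master URL and application script are placeholders)
spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --supervise \
  my_app.py
```

Even with `--supervise`, this restarts the driver process; it does not transparently resume an in-flight job the way task-level recovery does for workers.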