As a data engineer, you are troubleshooting a failed Spark job. The job has failed with an OutOfMemoryError. What steps would you take to diagnose and resolve this issue?
A. Increase the memory allocated to the Spark executors.
B. Check the logs for any specific error messages related to the OutOfMemoryError.
C. Optimize the Spark job by reducing the data shuffling across the nodes.
D. Restart the Spark job without making any changes.
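For context, options A and C correspond to concrete configuration changes, and option B to a log inspection step. A minimal sketch of what those might look like in practice is below; the memory values, partition count, job file name (`my_job.py`), and application ID placeholder are all hypothetical and would need tuning for a real cluster:

```shell
# Option A: raise executor heap and off-heap overhead (illustrative values).
# Option C: raise shuffle partition count so each shuffle task handles less data.
spark-submit \
  --conf spark.executor.memory=8g \
  --conf spark.executor.memoryOverhead=2g \
  --conf spark.sql.shuffle.partitions=400 \
  my_job.py

# Option B: search the executor logs for the failing stage (YARN example).
# yarn logs -applicationId <app_id> | grep -i 'OutOfMemoryError'
```

Note that option D (restarting unchanged) is rarely effective: an OutOfMemoryError caused by data volume or skew will recur until memory allocation or shuffle behavior changes.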