What indicators in the Spark UI's Storage tab should a data engineer monitor to identify suboptimal performance of a cached table when using the MEMORY_ONLY storage level?
A. On Heap Memory Usage is within 75% of Off Heap Memory Usage
B. The RDD Block Name includes the "*" annotation signaling a failure to cache
C. Size on Disk is > 0
D. The number of Cached Partitions > the number of Spark Partitions
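For context, a minimal PySpark sketch (assumed setup, not part of the original question; the table name `sales` is hypothetical) of how a table ends up in the Storage tab under the MEMORY_ONLY level. With MEMORY_ONLY, partitions that do not fit in executor memory are dropped and recomputed on access rather than spilled to disk, and the Storage tab reports per-RDD metrics such as Cached Partitions, Fraction Cached, Size in Memory, and Size on Disk.

```python
from pyspark.sql import SparkSession
from pyspark import StorageLevel

spark = SparkSession.builder.appName("cache-monitoring-sketch").getOrCreate()

# Hypothetical table name; substitute a table that exists in your metastore.
df = spark.table("sales")

# MEMORY_ONLY: deserialized objects kept in executor memory only; partitions
# that do not fit are evicted and recomputed when next accessed.
df.persist(StorageLevel.MEMORY_ONLY)

# Trigger an action so the cache is materialized and appears in the
# Spark UI's Storage tab (default UI at http://<driver-host>:4040/storage/).
df.count()
```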