Consider a DataFrame df with a column 'value' that contains both NULL and non-NULL values. You need to count the number of rows where 'value' is NULL. Which PySpark expression would you use to achieve this?
A. df.filter(col('value').isNull()).count()
B. df.filter(col('value') == NULL).count()
C. df.filter(col('value').isNotNull()).count()
D. df.filter(col('value') != NULL).count()
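
Option A is correct: Column.isNull() is the predicate PySpark provides for null checks. Python has no NULL literal, so options B and D raise a NameError before Spark even sees the query, and option C counts the non-NULL rows instead. Below is a minimal runnable sketch of option A; the sample data and app name are illustrative assumptions, not part of the question.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("null-count-demo").getOrCreate()

# Illustrative data: None becomes NULL in the resulting DataFrame.
df = spark.createDataFrame(
    [(1, "a"), (2, None), (3, None), (4, "d")],
    ["id", "value"],
)

# isNull() builds a Column predicate; filter() keeps only matching rows,
# and count() returns how many rows survived the filter.
null_count = df.filter(col("value").isNull()).count()
print(null_count)  # 2

spark.stop()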