Why is standardizing or normalizing features crucial in Spark ML data preprocessing?
A. It simplifies feature selection
B. It adds complexity to the model
C. It ensures all features are on a similar scale, preventing any particular feature from dominating
D. It slows down model training
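The reasoning behind option C is that distance-based and gradient-based algorithms are sensitive to feature magnitude: a feature measured in the tens of thousands (e.g. income) can dominate one measured in the tens (e.g. age) purely because of its scale. Below is a minimal PySpark sketch illustrating this with `pyspark.ml.feature.StandardScaler`; the column names and sample values are illustrative assumptions, not part of the question.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler, StandardScaler

spark = SparkSession.builder.appName("scaling-demo").getOrCreate()

# Two features on very different scales: income (tens of thousands) vs. age (tens).
df = spark.createDataFrame(
    [(45000.0, 23.0), (120000.0, 54.0), (68000.0, 31.0)],
    ["income", "age"],
)

# Spark ML scalers operate on a single vector column, so assemble first.
assembler = VectorAssembler(inputCols=["income", "age"], outputCol="features")
assembled = assembler.transform(df)

# StandardScaler rescales each feature to unit standard deviation (and, with
# withMean=True, zero mean), so no feature dominates purely due to its magnitude.
scaler = StandardScaler(inputCol="features", outputCol="scaled_features",
                        withMean=True, withStd=True)
model = scaler.fit(assembled)
model.transform(assembled).select("scaled_features").show(truncate=False)
```

After scaling, both features contribute on comparable terms, which is exactly why standardization or normalization is a standard preprocessing step in Spark ML pipelines.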