In a machine learning project with Spark ML, your team encounters input features that vary in scale. Which Spark ML-supported technique can standardize these features to ensure they contribute equally to the model?
A. Robust Scaling
B. Feature Importance Scaling
C. Z-Score Normalization
D. Min-Max Scaling
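
For reference, Spark ML exposes z-score standardization through its StandardScaler transformer, which rescales each feature to zero mean and unit variance so that differently scaled inputs contribute comparably. A minimal PySpark sketch follows; the DataFrame contents and column names are illustrative assumptions, not part of the question.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler, StandardScaler

spark = SparkSession.builder.appName("scaling-example").getOrCreate()

# Illustrative data: two features on very different scales.
df = spark.createDataFrame(
    [(1, 0.5, 1200.0), (2, 0.1, 35000.0), (3, 0.9, 4800.0)],
    ["id", "ratio", "income"],
)

# Spark ML scalers operate on a single vector column, so assemble the raw
# feature columns into one vector first.
assembler = VectorAssembler(inputCols=["ratio", "income"], outputCol="features")
assembled = assembler.transform(df)

# StandardScaler performs z-score normalization: (x - mean) / stddev.
scaler = StandardScaler(
    inputCol="features",
    outputCol="scaled_features",
    withMean=True,  # subtract each feature's mean
    withStd=True,   # divide by each feature's standard deviation
)
model = scaler.fit(assembled)
model.transform(assembled).select("id", "scaled_features").show(truncate=False)
```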