
Answer-first summary for fast verification
Answer: Z-Score Normalization
**Correct Answer: C (Z-Score Normalization)** Z-Score Normalization, also called Standardization, is supported in Spark ML through the `StandardScaler` transformer. It rescales each feature to have a mean of 0 and a standard deviation of 1, putting all features on a comparable scale so that features with larger numeric ranges do not disproportionately influence the model. Min-Max Scaling and Robust Scaling are alternative normalization methods, but Z-Score Normalization is the standard choice in Spark ML for equalizing feature influence. Feature Importance Scaling is not a recognized normalization technique.
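The math behind Z-Score Normalization can be sketched in plain Python. In Spark ML the equivalent operation is `StandardScaler` with `withMean=True` and `withStd=True`; this standalone sketch just illustrates what that transformer computes per feature.

```python
def z_score_normalize(column):
    """Return the z-scored values of a list of numbers: mean 0, std 1."""
    n = len(column)
    mean = sum(column) / n
    # Population standard deviation for simplicity; Spark's StandardScaler
    # uses the sample (unbiased) standard deviation by default.
    std = (sum((x - mean) ** 2 for x in column) / n) ** 0.5
    return [(x - mean) / std for x in column]

values = [10.0, 20.0, 30.0]
scaled = z_score_normalize(values)
# mean 20, std ~8.165, so the result is roughly [-1.22, 0.0, 1.22]
```

After scaling, every feature contributes on the same numeric footing, which is why the technique prevents large-range features from dominating distance- or gradient-based models.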
Author: LeetQuiz Editorial Team
In a machine learning project with Spark ML, your team encounters input features that vary in scale. Which Spark ML-supported technique can standardize these features to ensure they contribute equally to the model?
A. Robust Scaling
B. Feature Importance Scaling
C. Z-Score Normalization
D. Min-Max Scaling