
Google Professional Machine Learning Engineer
You are leading a machine learning project aimed at predicting housing prices. The dataset includes features such as square footage (ranging from 500 to 10,000), number of bedrooms (1 to 5), and age of the property (1 to 100 years). During the training phase of your neural network model, you observe that gradient optimization is struggling to converge, likely due to the varying scales of the features. Considering the need for a solution that ensures numerical stability and enhances convergence without compromising the integrity of the data, what is the most effective action to take? Choose one correct option.
Explanation:
The most effective action to address the issue of varying feature scales affecting gradient optimization in a neural network is to apply a feature scaling technique such as normalization or standardization. This rescales all features to comparable ranges, ensuring that no single feature disproportionately influences the gradient updates simply because of its scale. The benefits are improved numerical stability and faster convergence during optimization. The other options do not directly solve the scale disparity: removing features with missing values or altering the dataset partition leaves the ranges untouched, and combining features might reduce dimensionality but does not necessarily equalize their scales.
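The scaling described above can be sketched with plain NumPy. This is a minimal illustration using made-up feature values in the ranges the question states (square footage 500-10,000, bedrooms 1-5, age 1-100); min-max normalization maps each feature column to [0, 1] so all features contribute on a comparable scale during gradient updates.

```python
import numpy as np

# Toy housing data, one row per property:
# [square footage, bedrooms, age in years] -- deliberately different scales.
X = np.array([
    [500.0,   1.0, 100.0],
    [2500.0,  3.0,  20.0],
    [10000.0, 5.0,   1.0],
])

# Min-max normalization: rescale each feature column to the [0, 1] range.
X_min = X.min(axis=0)
X_max = X.max(axis=0)
X_norm = (X - X_min) / (X_max - X_min)

print(X_norm)
```

After this transform every column spans exactly [0, 1], so no feature dominates the gradient because of its raw magnitude. In practice a library utility (e.g. scikit-learn's `MinMaxScaler`) would typically be used so the same minima and maxima learned on the training set can be applied to validation and test data.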