
Google Professional Machine Learning Engineer
During the process of batch training a neural network, you observe that the training loss shows oscillatory behavior, indicating instability in the learning process. This could prevent the model from converging to an optimal solution. What modification should you apply to your model's hyperparameters to help it converge more reliably?
Explanation:
The correct answer is B. Oscillations in the training loss during batch training typically indicate that the learning rate is too high. An overly large learning rate causes the model parameters to update too aggressively, overshooting the minimum and producing unstable, oscillating loss values. Lowering the learning rate lets the model adjust its parameters more gradually, promoting stable convergence.
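The effect can be illustrated with a minimal sketch (an assumption for illustration, not part of the exam): plain gradient descent on the quadratic loss f(w) = w², whose gradient is 2w, so each update is w ← w(1 − 2·lr). When |1 − 2·lr| > 1 the iterates flip sign and grow (oscillating divergence); when it is below 1 they shrink smoothly.

```python
# Hypothetical helper for illustration: gradient descent on f(w) = w**2.
def descend(lr, steps=20, w=1.0):
    """Return the loss trajectory f(w) = w**2 over `steps` updates."""
    losses = []
    for _ in range(steps):
        w = w - lr * 2 * w  # gradient step: grad of w**2 is 2w
        losses.append(w * w)
    return losses

high = descend(lr=1.1)  # |1 - 2*1.1| = 1.2 > 1: w flips sign each step, loss grows
low = descend(lr=0.1)   # |1 - 2*0.1| = 0.8 < 1: loss shrinks monotonically

print(high[-1] > high[0])  # True -- oscillating divergence
print(low[-1] < low[0])    # True -- stable convergence
```

Real networks have far messier loss surfaces, but the same mechanism applies: shrinking the learning rate keeps each parameter update inside the stable regime.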