
Answer-first summary for fast verification
**Answer:** Add dropout or reduce the number of parameters
**Explanation:** High variance (overfitting) occurs when a model is too complex and learns the noise in the training data rather than the underlying pattern. To reduce high variance:

1. **Add dropout** - a regularization technique that randomly drops units during training, preventing the model from relying too heavily on specific neurons and thereby reducing overfitting (see the code sketch after the lists below).
2. **Reduce the number of parameters** - simplifying the architecture by using fewer layers or neurons makes the model less complex and less prone to overfitting.

**Why the other options are incorrect:**

- **A. Use a more complex model** - this would increase variance (overfitting) rather than reduce it.
- **C. Increase the learning rate** - this affects training speed and convergence but does not directly address variance; it can even destabilize training.
- **D. Reduce the dataset size** - this would likely increase variance, because with less data the model is more likely to overfit the limited training examples.

**Additional methods to reduce high variance:**

- Regularization techniques (L1, L2)
- Early stopping
- Data augmentation
- Cross-validation
- Ensemble methods (bagging)
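The two recommended fixes are easiest to see together in code. Below is a minimal PyTorch sketch (the framework choice, the model shape, and names such as `SmallClassifier` and `p_drop` are assumptions for illustration, not from the original answer): `nn.Dropout` layers supply the regularization, and deliberately narrow hidden layers keep the parameter count down.

```python
import torch
import torch.nn as nn

class SmallClassifier(nn.Module):
    """Hypothetical model showing two variance-reduction levers:
    dropout layers and a deliberately small parameter count."""

    def __init__(self, in_features: int = 20, num_classes: int = 3,
                 p_drop: float = 0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, 64),
            nn.ReLU(),
            nn.Dropout(p_drop),        # randomly zeroes units during training
            nn.Linear(64, 32),         # narrow layers = fewer parameters
            nn.ReLU(),
            nn.Dropout(p_drop),
            nn.Linear(32, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = SmallClassifier()

model.train()                          # dropout is active in train mode
logits = model(torch.randn(8, 20))

model.eval()                           # dropout is a no-op at inference;
with torch.no_grad():                  # PyTorch scales activations at train
    preds = model(torch.randn(8, 20))  # time, so no rescaling is needed here
```

Of the additional methods listed above, L2 regularization is often the cheapest to try: in PyTorch it is exposed as the optimizer's `weight_decay` argument, e.g. `torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)`.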
Author: Ritesh Yadav