
**Answer:** Option B — a simpler machine-learning model.
## Explanation

Overfitting occurs when a machine-learning model learns not only the underlying patterns in the training data but also the noise and random fluctuations. The result is excellent performance on the training data but poor generalization to new, unseen data.

**Why option B is correct:**

- **Simpler models** have fewer parameters and less capacity, which makes them less likely to memorize noise in the training data.
- Simpler models (e.g., linear regression vs. high-degree polynomial regression) have higher bias but lower variance, which reduces overfitting.
- Techniques such as regularization, pruning decision trees, or otherwise reducing model complexity are standard approaches to mitigating overfitting.

**Why the other options are incorrect:**

- **Option A (higher computing power):** More computing power can help with techniques such as cross-validation or ensemble methods, but it does not directly mitigate overfitting. In fact, it may simply enable the training of even more complex models that overfit more.
- **Option C (unsupervised machine-learning model):** Unsupervised learning (e.g., clustering) has no labeled outputs to fit, so overfitting does not apply in the same way. Overfitting is primarily a concern in supervised learning, where models learn from labeled training data.

**Additional context:** Other common techniques for mitigating overfitting include:

1. **Regularization** (L1/L2)
2. **Cross-validation**
3. **Early stopping**
4. **Dropout** (for neural networks)
5. **Increasing training data**
6. **Feature selection/reduction**
7. **Ensemble methods** (bagging, boosting)

Among the given options, however, using a simpler model is the most direct and fundamental way to reduce overfitting.
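The bias–variance trade-off above can be sketched with a toy experiment (the dataset and all names here are illustrative, not part of the original question): an extreme "complex" model that memorizes every training point achieves zero training error but cannot generalize at all, while a one-parameter linear model fits the underlying trend and transfers to unseen inputs.

```python
import random

random.seed(0)

# Toy data: underlying pattern y = 2x, plus Gaussian noise.
train = [(x, 2 * x + random.gauss(0, 1.0)) for x in range(20)]
# Test inputs fall between the training points, so they are unseen.
test = [(x + 0.5, 2 * (x + 0.5)) for x in range(20)]

# "Complex" model: a lookup table that memorizes the training set.
lookup = dict(train)

# "Simple" model: one parameter, a slope through the origin,
# fit by least squares: w = sum(x*y) / sum(x*x).
w = sum(x * y for x, y in train) / sum(x * x for x, _ in train)

def mse(model, data):
    """Mean squared error of a model over (x, y) pairs."""
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# The lookup table is perfect on training data by construction...
train_err_lookup = mse(lambda x: lookup[x], train)   # 0.0

# ...but it has no prediction at all for unseen x (KeyError),
# the extreme case of overfitting: zero generalization.

# The simple model tolerates some training error...
train_err_simple = mse(lambda x: w * x, train)
# ...yet generalizes well, because it ignored the noise.
test_err_simple = mse(lambda x: w * x, test)
```

Running this, the fitted slope `w` lands close to the true value 2, and the simple model's test error stays small, while the memorizing model cannot even produce predictions for the unseen inputs.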
Author: LeetQuiz.