
In machine learning, a model's complexity is often determined by the number of features it incorporates. Which of the following statements correctly describes the bias-variance trade-off for large models with many features versus smaller models with fewer features?
A
Large models with many features have low bias and low variance, while smaller models with fewer features have high bias and high variance
B
Large models with many features have high bias and high variance, while smaller models with fewer features have low bias and low variance
C
Large models with many features have low bias and high variance, while smaller models with fewer features have high bias and low variance
D
Large models with many features have high bias and low variance, while smaller models with fewer features have low bias and high variance
Explanation:
In machine learning, the bias-variance trade-off is a fundamental concept that describes the relationship between model complexity and generalization performance:
- Large models with many features are flexible enough to capture complex patterns in the training data, so they tend to have low bias; that same flexibility makes their predictions sensitive to the particular training sample, giving them high variance and a tendency to overfit.
- Smaller models with fewer features impose stronger assumptions and may miss relevant structure, so they tend to have high bias; because they are less affected by fluctuations in the training data, they have low variance and are more likely to underfit than to overfit.
The correct answer is therefore C. This bias-variance trade-off is crucial in machine learning model design, where practitioners must balance model complexity to achieve good performance on both training and test data.
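To see the trade-off concretely, here is a minimal Python sketch (an illustrative setup, not part of the original question) that empirically estimates squared bias and variance by repeatedly fitting a simple model (degree-1 polynomial) and a more complex model (degree-10 polynomial) to resampled noisy training sets drawn from a sine function; the function and variable names (`simulate`, `true_fn`, etc.) are assumptions made for this example.

```python
import numpy as np

# Illustrative sketch: estimate bias^2 and variance empirically by fitting
# polynomial models of different complexity to many resampled training sets
# drawn from the same noisy data-generating process.
rng = np.random.default_rng(0)
true_fn = np.sin                      # underlying target function (assumed for the demo)
x_test = np.linspace(0, np.pi, 50)    # fixed test inputs

def simulate(degree, n_trials=200, n_train=30, noise=0.3):
    preds = np.empty((n_trials, x_test.size))
    for t in range(n_trials):
        x = rng.uniform(0, np.pi, n_train)
        y = true_fn(x) + rng.normal(0, noise, n_train)
        coefs = np.polyfit(x, y, degree)      # fit a model of the given complexity
        preds[t] = np.polyval(coefs, x_test)  # predict on the same test grid
    avg_pred = preds.mean(axis=0)
    bias_sq = np.mean((avg_pred - true_fn(x_test)) ** 2)  # squared bias of the average model
    variance = preds.var(axis=0).mean()                   # spread of predictions across trials
    return bias_sq, variance

for degree in (1, 10):  # "small" model vs "large" model
    b, v = simulate(degree)
    print(f"degree={degree:2d}  bias^2={b:.4f}  variance={v:.4f}")
```

Run as written, the degree-1 model should show a noticeably larger squared bias, while the degree-10 model should show a larger variance, which is the pattern described in option C.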