Explain the concept of bias-variance tradeoff in the context of ensemble learning. How do bagging, boosting, and stacking address this tradeoff?
A. Bagging reduces variance, boosting reduces bias, and stacking balances both. Ensemble methods improve performance by addressing the bias-variance tradeoff.
B. Bagging increases variance, boosting reduces bias, and stacking increases model complexity. Ensemble methods do not address the bias-variance tradeoff.
C. Bagging and boosting are identical in addressing the bias-variance tradeoff. Stacking is used for reducing model complexity.
D. Ensemble methods do not affect the bias-variance tradeoff. They only increase model complexity.
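To make the roles described in the question concrete, here is a minimal sketch (assuming scikit-learn >= 1.2 and its `BaggingClassifier`, `AdaBoostClassifier`, and `StackingClassifier`; the dataset and hyperparameters are illustrative): bagging averages many deep, high-variance trees to reduce variance, boosting sequentially fits shallow, high-bias stumps to reduce bias, and stacking uses a meta-learner to combine both.

```python
# Illustrative sketch only; assumes scikit-learn >= 1.2.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification data for demonstration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Bagging: averages many deep (low-bias, high-variance) trees -> variance reduction.
bagging = BaggingClassifier(
    estimator=DecisionTreeClassifier(max_depth=None),
    n_estimators=100,
    random_state=0,
)

# Boosting: sequentially fits shallow (high-bias) stumps -> bias reduction.
boosting = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=100,
    random_state=0,
)

# Stacking: a meta-learner combines both base ensembles, balancing the tradeoff.
stacking = StackingClassifier(
    estimators=[("bag", bagging), ("boost", boosting)],
    final_estimator=LogisticRegression(),
)

for name, model in [("bagging", bagging), ("boosting", boosting),
                    ("stacking", stacking)]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```

Comparing the cross-validated scores of the three ensembles against a single decision tree is one way to observe the variance reduction from bagging and the bias reduction from boosting in practice.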