
Answer-first summary for fast verification
Answer: Bagging, as it reduces the variance of the model and is less prone to overfitting.
With many features and relatively few samples, a single model has enough capacity to memorize the training set, so it is highly likely to overfit. Bagging (bootstrap aggregating) mitigates this by training multiple models on bootstrap resamples of the data and averaging (or majority-voting) their predictions, which reduces variance without adding much bias. Boosting, by contrast, fits each successive model to the errors of the previous ones and can amplify overfitting on small, noisy datasets. This makes option A the correct choice.
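To make the mechanism concrete, here is a minimal, self-contained sketch of bagging: each model is fit on a bootstrap resample and predictions are combined by majority vote. The `CentroidClassifier` base learner and the tiny dataset are illustrative stand-ins, not part of the question; in practice you would use something like scikit-learn's `BaggingClassifier` with decision trees.

```python
import random
from collections import Counter

def bootstrap_sample(X, y, rng):
    """Draw a bootstrap sample (with replacement) the same size as the data."""
    idx = [rng.randrange(len(X)) for _ in range(len(X))]
    return [X[i] for i in idx], [y[i] for i in idx]

class CentroidClassifier:
    """Toy base learner: predicts the class whose feature-mean centroid is nearest."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            pts = [x for x, lab in zip(X, y) if lab == label]
            self.centroids[label] = [sum(col) / len(pts) for col in zip(*pts)]
        return self

    def predict(self, x):
        return min(self.centroids,
                   key=lambda lab: sum((a - b) ** 2
                                       for a, b in zip(x, self.centroids[lab])))

def bagging_predict(models, x):
    """Aggregate the ensemble by majority vote across bootstrap-trained models."""
    votes = Counter(m.predict(x) for m in models)
    return votes.most_common(1)[0][0]

# Tiny illustrative dataset: 2 features, 6 samples, two well-separated classes.
X = [[1.0, 1.0], [1.2, 0.9], [0.8, 1.1], [3.0, 3.0], [3.2, 2.9], [2.8, 3.1]]
y = [0, 0, 0, 1, 1, 1]

rng = random.Random(42)
models = [CentroidClassifier().fit(*bootstrap_sample(X, y, rng)) for _ in range(25)]

print(bagging_predict(models, [1.1, 1.0]))  # votes from 25 models, near class 0
print(bagging_predict(models, [3.1, 3.0]))  # votes from 25 models, near class 1
```

Because each of the 25 models sees a different resample, their individual errors are partly independent, and the vote averages them out; that averaging is exactly the variance reduction the answer refers to.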
Author: LeetQuiz Editorial Team
Consider a scenario where you have a dataset with a large number of features and a relatively small number of samples. Which ensemble technique would you recommend, and why?
A. Bagging, as it reduces the variance of the model and is less prone to overfitting.
B. Boosting, as it focuses on the errors made by previous models and can handle noisy data.
C. Stacking, as it combines the strengths of different models and can handle a large number of features.
D. None of the above, as the dataset is too small for ensemble techniques to be effective.