Explain the concept of feature importance in ensemble learning. How do bagging, boosting, and stacking determine the importance of features in their base models?
A
Bagging averages feature importance across models, boosting assigns importance based on error correction, and stacking combines feature importance from multiple models.
B
Feature importance is not relevant in ensemble methods. All methods rely on random feature selection.
C
Feature importance is identical in all ensemble methods. All methods use the same algorithm to determine feature importance.
D
Feature importance is only relevant for boosting. Bagging and stacking do not consider feature importance.
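For concreteness, here is a minimal sketch of the idea described in option A, assuming scikit-learn and its impurity-based importances (the dataset, estimator counts, and variable names are illustrative, not part of the question):

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Bagging: independent trees are fit on bootstrap samples, so a natural
# ensemble-level importance is the average of the per-tree importances.
bagging = BaggingClassifier(
    DecisionTreeClassifier(), n_estimators=50, random_state=0
).fit(X, y)
bagging_importance = np.mean(
    [tree.feature_importances_ for tree in bagging.estimators_], axis=0
)

# Boosting: each stage is fit to the previous stages' errors, so the
# aggregated importances reflect how much each feature helps correct them.
boosting = GradientBoostingClassifier(n_estimators=50, random_state=0).fit(X, y)
boosting_importance = boosting.feature_importances_

print("bagging (first 5 features):", bagging_importance[:5])
print("boosting (first 5 features):", boosting_importance[:5])
```

Stacking has no single built-in importance of this kind; in practice the per-model importances (or the meta-learner's weights on each base model's predictions) are combined or inspected together, which is what option A alludes to.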