
Answer-first summary for fast verification
Answer: Use L1 regularization to reduce the coefficients of uninformative features to 0.
The answer is B. L1 regularization (lasso) adds a penalty proportional to the sum of the absolute values of the model's weights. This penalty drives the coefficients of non-informative features exactly to zero, removing their influence while leaving the informative features in their original form, which is precisely what the question requires. PCA, by contrast, projects the data onto new composite components, so the original features are not retained. Shapley values can rank feature importance after the model is built, but they do not remove features during training, and an iterative dropout (leave-one-feature-out) procedure over 100+ features would be computationally impractical.
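The effect described above can be sketched with scikit-learn's `Lasso` estimator. The dataset below is synthetic and illustrative (the feature count, penalty strength `alpha=0.1`, and the choice of two informative features are assumptions, not part of the question): only the first two columns carry signal, and the L1 penalty should shrink the remaining coefficients to exactly zero.

```python
# Sketch: L1 regularization (Lasso) zeroing out uninformative features.
# Synthetic data; alpha and feature counts are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_samples, n_features = 200, 10
X = rng.uniform(-1, 1, size=(n_samples, n_features))  # features in [-1, 1]

# Only features 0 and 1 contribute to the target; the rest are noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=n_samples)

model = Lasso(alpha=0.1)  # alpha sets the L1 penalty strength
model.fit(X, y)

# Features whose coefficients survived the L1 penalty.
surviving = [i for i, c in enumerate(model.coef_) if abs(c) > 1e-6]
print("surviving features:", surviving)
```

With enough samples and a suitably chosen `alpha`, the surviving features are exactly the informative ones, while the noise features' coefficients are driven to zero rather than merely shrunk — the key contrast with L2 regularization.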
Author: LeetQuiz Editorial Team
You are developing a machine learning model based on linear regression with over 100 input features. All feature values range between -1 and 1. You hypothesize that many of these features do not contribute meaningful information to the model's predictions. To streamline the model, you wish to remove these non-informative features while retaining the important ones in their original form. Which technique should you use?
A
Use principal component analysis (PCA) to eliminate the least informative features.
B
Use L1 regularization to reduce the coefficients of uninformative features to 0.
C
After building your model, use Shapley values to determine which features are the most informative.
D
Use an iterative dropout technique to identify which features do not degrade the model when removed.