
Financial Risk Manager Part 1
In multiple linear regression models, there is always a risk of overestimating the impact of additional variables on the explanatory power of the resulting model, which is why most researchers recommend using the adjusted R², R̄², instead of R² itself. This adjusted R²:
Explanation:
The adjusted R-squared, R̄², will never be greater than the regression R². This is because the adjusted R-squared is a modification of R² that accounts for the number of predictors in the model: it increases only if a new variable improves the model by more than would be expected by chance, and it decreases when the variable improves the model by less than that. Therefore, the adjusted R-squared is always less than or equal to R².
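For reference, one standard form of the adjusted R-squared, with n observations and k regressors (intercept excluded), is sketched below; because the ratio (n − 1)/(n − k − 1) is at least 1 whenever k ≥ 1, the inequality R̄² ≤ R² follows directly.

```latex
% Adjusted R-squared for n observations and k regressors (intercept excluded)
\bar{R}^{2} = 1 - \left(1 - R^{2}\right)\frac{n-1}{n-k-1}
% Since (n-1)/(n-k-1) \ge 1 whenever k \ge 1, we have
% 1 - \bar{R}^{2} \ge 1 - R^{2}, and therefore \bar{R}^{2} \le R^{2}.
```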
Choice A is incorrect. The adjusted R-squared, R̄², is not always positive. It can be negative when the model's explanatory power is poor and the number of predictors (independent variables) is large relative to the number of observations.
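As a rough numerical illustration (the figures below are made up for the example, not taken from the question), plugging small values into the same formula shows how R̄² turns negative when R² is low and the number of predictors is large relative to the sample size:

```python
def adjusted_r_squared(r_squared: float, n: int, k: int) -> float:
    """Adjusted R-squared for n observations and k regressors (intercept excluded)."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# Weak fit (R^2 = 0.10), many predictors, few observations -> negative
print(adjusted_r_squared(0.10, n=15, k=6))    # approx. -0.58
# The same weak fit with a much larger sample stays (slightly) positive
print(adjusted_r_squared(0.10, n=200, k=6))   # approx.  0.07
```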
Choice C is incorrect. The adjusted R-squared, R̄², can increase when an additional independent variable that meaningfully improves the model's fit is incorporated into it. However, unlike R², R̄² penalizes the addition of variables that do not improve the model significantly.
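To make this concrete, here is a minimal simulation sketch (NumPy, with simulated data and hypothetical variable names): adding a genuinely informative regressor raises both R² and R̄², while adding a pure-noise regressor nudges R² up but typically pulls R̄² down.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)        # genuinely informative regressor
noise = rng.normal(size=n)     # pure-noise regressor
y = 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)

def r2_and_adj_r2(X: np.ndarray, y: np.ndarray) -> tuple:
    """Fit OLS with an intercept and return (R-squared, adjusted R-squared)."""
    n, k = X.shape
    X_design = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(X_design, y, rcond=None)
    resid = y - X_design @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    return r2, adj

print(r2_and_adj_r2(x1[:, None], y))                       # baseline: x1 only
print(r2_and_adj_r2(np.column_stack([x1, x2]), y))         # add x2: both measures rise
print(r2_and_adj_r2(np.column_stack([x1, x2, noise]), y))  # add noise: R^2 up slightly, adjusted R^2 typically down
```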
Choice D is incorrect. The adjusted R-squared is not always negative; it can be positive when the model has good explanatory power.