
Answer-first summary for fast verification
Answer: Model 2 must have a lower value of adjusted R-squared.
## Explanation

When two regression models have identical R-squared values, the additional variable in the second model ($x_3$) does **not** add explanatory power for the dependent variable $y$.

### Key Points:
- **R-squared** is the proportion of variance in the dependent variable explained by the independent variables.
- **Identical R-squared values** mean $x_3$ provides no additional explanatory power.
- **Adjusted R-squared** penalizes a model for including unnecessary predictors.
- Since $x_3$ does not improve model fit, the adjusted R-squared for Model 2 will be **lower** than for Model 1.

### Why the Other Options Are Incorrect:
- **B**: Adjusted R-squared rises only if a new predictor improves model fit by more than would be expected by chance.
- **C**: Adjusted R-squared depends on the number of predictors, so identical R-squared values do not imply identical adjusted R-squared values.
- **D**: If $x_3$ were statistically significant, adding it would increase R-squared, which contradicts the given condition.

### Mathematical Insight:
The adjusted R-squared formula is
$$R^2_{adj} = 1 - \frac{(1-R^2)(n-1)}{n-k-1}$$
where $n$ is the sample size and $k$ is the number of predictors. Model 2 has one more predictor than Model 1, so with the same $R^2$ its denominator $n-k-1$ is smaller, the penalty fraction is larger, and $R^2_{adj}$ is lower for Model 2.
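The formula above can be checked with a minimal numeric sketch; the sample size ($n = 50$), the shared $R^2 = 0.60$, and the predictor counts are assumed purely for illustration:

```python
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Adjusted R-squared: 1 - (1 - R^2)(n - 1) / (n - k - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

n, r2 = 50, 0.60                     # assumed values for illustration
model1 = adjusted_r2(r2, n, k=2)     # Model 1 with two predictors
model2 = adjusted_r2(r2, n, k=3)     # Model 2 adds x3, same R^2

print(f"Model 1 adjusted R^2: {model1:.4f}")  # 0.5830
print(f"Model 2 adjusted R^2: {model2:.4f}")  # 0.5739
```

With identical $R^2$, the model with the extra predictor always comes out lower, which is exactly why option A is correct.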
Author: Tanishq Prabhu
Consider the following two regression models:
Model 1:
Model 2:
A researcher determines that the two models have identical R-squared values. This most likely implies that:
A
Model 2 must have a lower value of adjusted R-squared.
B
Model 2 must have a higher value of adjusted R-squared.
C
Model 1 and 2 will also have identical values of adjusted R-squared.
D
Variable $x_3$ is statistically significant.