
Answer: The independent variables are not perfectly multicollinear.
## Explanation

In multiple least squares regression, one of the key classical assumptions is that the independent variables are **not perfectly multicollinear**. This means:

- No independent variable can be expressed as a perfect linear combination of the other independent variables
- The design matrix (the matrix of independent variables) must have full column rank
- This ensures that the regression coefficients can be uniquely estimated

**Why the other options are incorrect:**

- **A**: Stationarity is not a required assumption for multiple least squares regression. While stationarity is important in time series analysis, cross-sectional regression models don't require a stationary dependent variable.
- **C**: Heteroskedastic error terms actually violate the classical assumptions of OLS. The correct assumption is that the error terms are **homoskedastic** (constant variance).
- **D**: Homoskedasticity refers to the error terms, not the independent variables. The assumption is that the **error terms** are homoskedastic, meaning they have constant variance across all observations.

**Key OLS Assumptions:**

1. Linearity in parameters
2. Random sampling
3. No perfect multicollinearity
4. Zero conditional mean (E(ε|X) = 0)
5. Homoskedasticity (constant variance of errors)
6. Normality of errors (for inference)

Option B correctly identifies one of these fundamental assumptions.
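Why the full-column-rank condition matters can be seen numerically. The sketch below (a hypothetical example using NumPy; the variable names and data are invented for illustration) builds a design matrix in which one regressor is an exact linear combination of two others, then shows the matrix is rank-deficient, so the normal equations have no unique solution:

```python
import numpy as np

# Hypothetical data: x3 is an exact linear combination of x1 and x2,
# i.e. the regressors are perfectly multicollinear.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2.0 * x1 + x2  # perfect linear combination

# Design matrix: intercept column plus the three regressors (4 columns).
X = np.column_stack([np.ones(n), x1, x2, x3])

# The design matrix does not have full column rank:
print(np.linalg.matrix_rank(X))  # 3, not 4

# Equivalently, the smallest singular value of X is numerically zero,
# so X'X is singular and the OLS normal equations (X'X)b = X'y
# do not pin down a unique coefficient vector b.
s = np.linalg.svd(X, compute_uv=False)
print(s[-1])  # effectively zero relative to s[0]
```

Dropping the redundant column (here, `x3`) restores full column rank and makes the coefficients uniquely estimable.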
Author: LeetQuiz.
Which of the following is assumed in the multiple least squares regression model?
A. The dependent variable is stationary.
B. The independent variables are not perfectly multicollinear.
C. The error terms are heteroskedastic.
D. The independent variables are homoskedastic.