
Answer-first summary for fast verification
Answer: There is no multi-collinearity
The correct answer is **D**. The multiple regression model assumes that there is no multicollinearity among the independent variables. Multicollinearity occurs when independent variables are highly correlated with each other, which can lead to unreliable coefficient estimates and inflated standard errors.

**Explanation of why the other options are incorrect:**

- **A is incorrect**: Homoskedasticity refers to the constant variance of the error terms, not of the independent variables. The assumption about the independent variables is that they are non-random and fixed in repeated sampling.
- **B is incorrect**: Heteroskedasticity (non-constant variance of the error terms) is a violation of one of the classical linear regression assumptions. The assumption is that the error terms are homoskedastic (have constant variance).
- **C is incorrect**: While stationarity can be important in time series analysis, it is not an assumption of the classical linear regression model. The dependent variable does not need to be "unique and stationary" as a formal assumption.

**Key assumptions of multiple linear regression include:**

1. Linearity in parameters
2. Random sampling
3. No perfect multicollinearity
4. Zero conditional mean of errors
5. Homoskedasticity of errors
6. Normality of errors (for inference)
7. No autocorrelation (for time series data)
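Multicollinearity is commonly diagnosed with variance inflation factors: VIF_j = 1 / (1 − R²_j), where R²_j comes from regressing the j-th independent variable on all the others. As a minimal NumPy sketch (the simulated data, function name, and the VIF > 10 rule of thumb are illustrative assumptions, not part of the question):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the design matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 is from regressing column j
    on all other columns (with an intercept).
    """
    n, k = X.shape
    vifs = []
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])   # add intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        ss_res = resid @ resid
        ss_tot = ((y - y.mean()) ** 2).sum()
        r2 = 1.0 - ss_res / ss_tot
        vifs.append(1.0 / (1.0 - r2))
    return np.array(vifs)

# Illustrative data: x2 is nearly a copy of x1, x3 is independent.
rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = x1 + 0.05 * rng.normal(size=500)   # highly collinear with x1
x3 = rng.normal(size=500)               # unrelated regressor
X = np.column_stack([x1, x2, x3])
print(vif(X))   # large VIFs for x1 and x2, VIF near 1 for x3
```

A common rule of thumb flags VIF values above about 10 as a sign of problematic multicollinearity; here the first two regressors would be flagged while the third would not.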
Author: Nikitesh Somanthe
One of the following assumptions is applied in the multiple least squares regression model. Which one?
A. The independent variables included in the model are homoskedastic
B. The residual terms are heteroskedastic
C. The dependent variable is unique and stationary
D. There is no multi-collinearity