
In multiple linear regression, if an estimator is said to be consistent, what does this imply?
Explanation:
In multiple linear regression, when an estimator is described as consistent, it means that as the sample size grows, the estimate converges in probability to the true parameter value. The other answer choices are incorrect for the following reasons:
A: This describes unbiasedness, not consistency. An unbiased estimator has an expected value equal to the true parameter, but this can occur at any sample size.
B: Consistency doesn't guarantee closeness "regardless of sample size"; the guarantee applies only as the sample size becomes large.
D: Consistency doesn't automatically imply unbiasedness or efficiency. An estimator can be consistent yet biased in finite samples, and efficiency is a separate property concerning minimum variance (see the formal sketch below).
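To make the distinction between these properties concrete, here is a brief formal sketch; the notation β̂ₙ (the estimator computed from a sample of size n) is introduced here for illustration and is not part of the original question.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Unbiasedness is a finite-sample property; consistency is asymptotic.
\[
\text{Unbiasedness (at any fixed } n\text{):}\qquad
\mathbb{E}\!\left[\hat{\beta}_n\right] = \beta
\]
\[
\text{Consistency:}\qquad
\operatorname*{plim}_{n\to\infty} \hat{\beta}_n = \beta
\quad\Longleftrightarrow\quad
\lim_{n\to\infty} P\!\left(\lVert \hat{\beta}_n - \beta \rVert > \varepsilon\right) = 0
\ \text{ for every } \varepsilon > 0.
\]
% Efficiency is a separate property: minimum variance within a class of estimators.
\end{document}
```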
Consistency is about asymptotic behavior: as n → ∞, the estimator converges in probability to the true parameter value. This is a fundamental property for reliable statistical inference in large samples.
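As a quick illustration (not part of the original material), here is a minimal simulation sketch, assuming numpy is available and using made-up true coefficients, showing OLS estimates in a two-regressor model settling around the true values as n grows.

```python
# Minimal sketch: OLS estimates drift toward the true coefficients
# as the sample size n increases (consistency in action).
# The "true" coefficients (2.0, -1.5) are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
true_beta = np.array([2.0, -1.5])

for n in (50, 500, 5_000, 50_000):
    X = rng.normal(size=(n, 2))                        # two regressors
    y = X @ true_beta + rng.normal(size=n)             # linear model plus noise
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit
    print(f"n={n:6d}  beta_hat={np.round(beta_hat, 3)}")
```

On a typical run the printed estimates should scatter noticeably at n = 50 and sit very close to (2.0, -1.5) by n = 50,000, which is the convergence-in-probability behavior described above.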