Heteroskedasticity and Regression Coefficient


Can someone explain the apparent anomaly in the following,

  1. Heteroskedasticity does not affect the consistency of the regression coefficient estimates, nor does it make the estimator biased

  2. Heteroskedasticity causes t-tests on the regression coefficients to be unreliable, because the standard errors of the coefficients become biased

How come the estimators remain BLUE and yet the tests are unreliable? Thanks in advance.



While heteroskedasticity does not introduce bias into the OLS estimates of the regression coefficients, it does bias the OLS estimates of the *variance* of those coefficients. Test statistics are built from those variance estimates (e.g. t = (b − B0)/SE(b), where SE(b) is the estimated standard error of the coefficient), so inferences based on them may be wrong. Basically:

heteroskedasticity → bias in the estimated variance of the coefficients → bias in the standard errors → incorrect test statistics → possibly incorrect inferences, i.e. rejecting hypotheses that are in fact true or failing to reject hypotheses that are in fact false.

As for BLUE: it stands for Best Linear Unbiased Estimator. Under heteroskedasticity, OLS estimates are still LINEAR and UNBIASED but no longer BEST, because among linear unbiased estimators, OLS no longer has the smallest variance. I hope this is right… (sits on magic carpet and hovers back into unemployment)
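A quick simulation can make both halves of this concrete. The sketch below (all model numbers are hypothetical choices, not from the thread) generates data where the error variance grows with x², fits OLS by hand, and compares the average slope estimate and the conventional homoskedastic standard error against the true sampling variability across replications:

```python
import numpy as np

# Monte Carlo sketch: under heteroskedastic errors, the OLS slope stays
# unbiased, but the conventional SE formula (s^2 / Sxx) is biased.
rng = np.random.default_rng(0)
b0_true, b1_true = 1.0, 2.0      # hypothetical true coefficients
n, reps = 200, 2000

b1_hats, conv_ses = [], []
for _ in range(reps):
    x = rng.uniform(-1, 1, n)
    # error variance increases with x**2 -> heteroskedasticity
    e = rng.standard_normal(n) * np.sqrt(0.1 + x**2)
    y = b0_true + b1_true * x + e

    xbar = x.mean()
    sxx = np.sum((x - xbar) ** 2)
    b1 = np.sum((x - xbar) * (y - y.mean())) / sxx
    b0 = y.mean() - b1 * xbar
    resid = y - b0 - b1 * x
    s2 = np.sum(resid ** 2) / (n - 2)    # homoskedastic variance estimate
    conv_ses.append(np.sqrt(s2 / sxx))   # conventional SE of the slope
    b1_hats.append(b1)

b1_hats = np.array(b1_hats)
conv_ses = np.array(conv_ses)
print("mean slope estimate: ", b1_hats.mean())   # close to 2.0 -> unbiased
print("true sampling SD:    ", b1_hats.std())
print("mean conventional SE:", conv_ses.mean())  # smaller than the true SD
```

In this setup the conventional SE understates the slope's true sampling variability, so t-statistics come out too large and nulls get rejected too often, which is exactly the "unreliable t-test" problem; with a different variance pattern the bias can go the other way.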

Thanks Alladin, that makes sense. But why does the volatility of volatility (meaning variance in the regression standard error) distort the standard error of the slope coefficient alone? Please remember, correcting for heteroskedasticity increases/decreases the standard error of the slope coefficient and not the intercept coefficient. Why?