Consequences of Heteroskedasticity

Related to section 4.1.1 of CFA Text

What are the consequences when the assumption of constant error variance is violated?

Although heteroskedasticity does not affect the consistency of the regression parameter estimators, it can lead to mistakes in inference.

Can somebody explain the following in simple language:

  1. Regression parameter estimators and

  2. Inference

They’re saying that the values you calculate for the slopes and intercept are likely to be good (i.e., unbiased), but that the standard errors you calculate are likely not to be, so your inferences (e.g., confidence intervals, hypothesis tests) are likely to be off.
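If it helps to see it with numbers, here’s a quick simulation sketch (Python with NumPy and statsmodels; the model and numbers are made up for illustration, not from the CFA text). It fits the same regression twice on heteroskedastic data: the slope and intercept come out essentially the same either way, but the naive standard errors differ from the heteroskedasticity-robust ones, which is exactly why the inference goes wrong.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000

# Error variance grows with x -> heteroskedasticity
x = rng.uniform(0, 10, n)
errors = rng.normal(0, 0.5 * x)
y = 2.0 + 1.5 * x + errors

X = sm.add_constant(x)
naive = sm.OLS(y, X).fit()                   # assumes constant error variance
robust = sm.OLS(y, X).fit(cov_type="HC1")    # White (heteroskedasticity-robust) SEs

print(naive.params)   # intercept and slope: close to 2.0 and 1.5
print(robust.params)  # identical point estimates -- estimators are fine
print(naive.bse)      # naive standard errors (unreliable here)
print(robust.bse)     # robust standard errors (valid under heteroskedasticity)
```

The point estimates match exactly; only the standard errors (and hence the t-stats and confidence intervals) change once you account for the non-constant error variance.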

S2000magician, kindly confirm:

regression parameter estimators = slopes and intercept?

Yup.