Efficiency and Consistency of Regression Estimators

How do heteroskedasticity, serial correlation, and multicollinearity affect the consistency and efficiency of regression estimators? Can someone help me understand this bit?

Heteroskedasticity - OLS estimators are still consistent and unbiased, but they are no longer the best we can do in terms of efficiency (i.e., they are inefficient), and the usual standard errors are incorrect.
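A minimal Monte Carlo sketch of that first point, assuming a toy model y = 2 + 3x + e where the error's standard deviation grows with x (my own illustrative setup, not anything from a specific text): even with heteroskedastic errors, the OLS slope estimates average out to the true value.

```python
import numpy as np

# Assumed toy model: y = 2 + 3*x + e, with error spread that grows
# with x (heteroskedasticity). OLS should still recover the true
# slope on average -- unbiasedness survives, efficiency does not.
rng = np.random.default_rng(0)
true_slope = 3.0
slopes = []
for _ in range(2000):
    x = rng.uniform(0, 10, size=200)
    e = rng.normal(0, 0.5 * x)          # error std dev depends on x
    y = 2.0 + true_slope * x + e
    b1, b0 = np.polyfit(x, y, 1)        # OLS fit; b1 is the slope
    slopes.append(b1)

print(np.mean(slopes))                  # close to the true slope, 3.0
```

The rep-to-rep spread of `slopes` is wider than it would be with constant-variance errors, which is the efficiency loss the post is describing.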

Serial correlation - OLS estimators become biased if the regressors are correlated with the error term (a violation of exogeneity), but not necessarily inconsistent. If serial correlation is the only assumption violated, our estimators will still be unbiased and consistent. However, the usual standard errors for the estimated coefficients are biased (they don't account for the serial correlation), and the OLS estimators are no longer efficient.
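The same kind of sketch for serial correlation, assuming AR(1) errors (e_t = 0.8 e_{t-1} + u_t) and a regressor that is independent of the errors (again my own toy setup): the slope estimates stay centered on the truth, which is why the problem lies with the standard-error formula rather than the point estimates.

```python
import numpy as np

# Assumed toy model: y = 0.5 + 1.5*x + e, where e follows an AR(1)
# process and x is exogenous (independent of e). OLS slope estimates
# remain centered on the true value despite the serial correlation.
rng = np.random.default_rng(1)
true_slope = 1.5
slopes = []
for _ in range(2000):
    n = 200
    x = rng.normal(size=n)              # exogenous regressor
    u = rng.normal(size=n)
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = 0.8 * e[t - 1] + u[t]    # serially correlated errors
    y = 0.5 + true_slope * x + e
    slopes.append(np.polyfit(x, y, 1)[0])

print(np.mean(slopes))                  # close to the true slope, 1.5
```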

Multicollinearity - no effect on the unbiasedness or consistency of OLS estimators. The coefficients may be estimated imprecisely and unreliably (you should be hesitant to interpret the size and direction of a coefficient), but unbiased and consistent estimates are still obtained. The estimators are still the most efficient, despite the possibility of inflated standard errors for the estimated coefficients.
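To see the "imprecise but still unbiased" point, here is a sketch assuming two regressors with correlation around 0.95 (an illustrative setup of my own): individual estimates bounce around a lot, but their average sits at the true coefficients.

```python
import numpy as np

# Assumed toy model: y = 1 + 2*x1 + 3*x2 + u, with corr(x1, x2) ~ 0.95.
# Each replication's coefficient estimates are noisy, but the averages
# across replications land on the true values -> unbiased and consistent.
rng = np.random.default_rng(2)
b1_hats, b2_hats = [], []
for _ in range(2000):
    n = 200
    x1 = rng.normal(size=n)
    x2 = 0.95 * x1 + np.sqrt(1 - 0.95**2) * rng.normal(size=n)
    y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS via least squares
    b1_hats.append(beta[1])
    b2_hats.append(beta[2])

print(np.mean(b1_hats), np.mean(b2_hats))   # near the true 2.0 and 3.0
```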

So as I understand it, if the error terms are correlated with the independent variables, or if multicollinearity exists, the coefficient estimates will be affected.

What do you mean by affected?

Multicollinearity will make it harder to estimate the coefficients. For example, the sign and magnitude of the coefficients could be different than expected (theory says both should be large and positive, but your estimate is small/positive for one and large/negative for the other). The standard errors on the pertinent coefficients will also be inflated.
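That inflation has a standard measure, the variance inflation factor, VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing regressor j on the others. A quick sketch, assuming the same two-regressor, corr ≈ 0.95 setup as above:

```python
import numpy as np

# VIF sketch: with two regressors and corr(x1, x2) ~ 0.95, the R^2 from
# regressing one on the other is ~0.95^2 = 0.9025, so each coefficient's
# variance is inflated by roughly 1 / (1 - 0.9025), about a factor of 10.
rng = np.random.default_rng(3)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + np.sqrt(1 - 0.95**2) * rng.normal(size=n)

r2 = np.corrcoef(x1, x2)[0, 1] ** 2   # R^2 of x2 on x1 (single regressor)
vif = 1.0 / (1.0 - r2)
print(vif)                            # roughly 10
```

A VIF of 10 means the coefficient's standard error is about sqrt(10) ≈ 3.2 times larger than it would be with an uncorrelated regressor, which is exactly why the size and sign of individual estimates become unreliable.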

If the error term is correlated with any of the independent variables, then you’ve violated the assumption that says the expected value of the error term is zero, conditional on any settings of the x variables (zero conditional mean; E(e|X) = 0). This is a necessary assumption for the estimates of OLS to be unbiased and consistent.
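One way to see the zero-conditional-mean violation is an omitted-variable sketch (my own toy setup): a variable z drives both x and y, and leaving z out of the model pushes it into the error term, making the error correlated with x. Unlike the earlier violations, the resulting bias does not shrink as the sample grows.

```python
import numpy as np

# Assumed setup: z affects both x and y; the fitted model omits z, so the
# composite error (z + noise) is correlated with x and E(e|x) != 0.
# With the true slope on x equal to 2.0, OLS converges to 2.5 here
# (plim = cov(x, y) / var(x) = 5/2), so more data does not fix the bias.
rng = np.random.default_rng(4)
n = 100_000                              # large n: the bias persists anyway
z = rng.normal(size=n)
x = z + rng.normal(size=n)               # x depends on z
y = 2.0 * x + 1.0 * z + rng.normal(size=n)   # z is omitted from the model
slope = np.polyfit(x, y, 1)[0]
print(slope)                             # near 2.5, not the true 2.0
```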

An important difference is that multicollinearity still allows for unbiased and consistent estimation, because it doesn’t violate any of the regression assumptions (barring perfect collinearity, but then, you wouldn’t be able to estimate the model). You just need to be careful when trying to make statements about partial effects based on the estimated coefficients.

tickersu, I hope you continue to participate in the L2 forum, especially quant, this coming year. Your thoughtful answers are extremely appreciated and never go unnoticed. Thanks from all.

I appreciate the kind words, and I will do my best to stay involved! Good luck studying!