Question related to regression models

What situation will cause inconsistent and biased estimated coefficients?

What situation will cause unreliable and incorrect estimated coefficients?

What situation will cause incorrect standard errors of the estimated coefficients?

What situation will cause both inconsistent and biased estimated coefficients as well as incorrect standard errors of the estimated coefficients?

I understand the meaning of “inconsistent” and “biased”, but I do not know what kind of violation causes each of these problems.

Sorry, I am confused by this part.

Essentially, you need 4 of the 6 assumptions mentioned in the book for OLS to be unbiased and consistent (you don’t need constant variance or normality for unbiasedness or consistency).

I would suggest flipping through the text to make a table for these. If you’d like, post the table when you’re done and I’m sure people would be happy to give you feedback on it (you likely won’t forget it after doing the legwork).

I am not claiming this to be a comprehensive answer, but more of a first step towards that goal:

1) Unconditional heteroskedasticity: a violation of the assumptions, but one that can usually be ignored.

2) Conditional heteroskedasticity: only affects the SEs, which become incorrect (the coefficient estimates themselves remain consistent). Solution: robust SEs or GLS (see the sketch after this list).

3) Serial correlation: parameter estimates inconsistent, SEs incorrect. Solution: robust standard errors (corrected for serial correlation).

4) Multicollinearity: consistency of the parameter estimates is not affected, but they become imprecise and unreliable. SEs are inflated (individual coefficients seem insignificant even though the R squared is very high).

5) OMV: parameters are biased and inconsistent, and the SEs are also inconsistent.
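
To make point 2) concrete, here is a minimal sketch using numpy and statsmodels (my own toy example, not from the curriculum, with made-up data-generating values). The slope estimate is still fine, but the default OLS standard error is unreliable, and the heteroskedasticity-robust (HC1) standard error corrects for it:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000
x = rng.uniform(1, 10, n)
e = rng.normal(0, x)          # error spread grows with x: conditional heteroskedasticity
y = 2.0 + 0.5 * x + e

X = sm.add_constant(x)
default_fit = sm.OLS(y, X).fit()                  # default (homoskedastic) standard errors
robust_fit = sm.OLS(y, X).fit(cov_type="HC1")     # heteroskedasticity-robust standard errors

print("slope estimate:", default_fit.params[1])   # close to the true 0.5
print("default SE:    ", default_fit.bse[1])      # unreliable under heteroskedasticity
print("robust SE:     ", robust_fit.bse[1])       # the corrected standard error
```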

Hope this helps!

I think parameter estimates are consistent under serial correlation unless a lagged value of the dependent variable is used as an independent variable.

By the way, what is OMV?

Yes, you are correct about that, thank you. I copied the relevant section from the book (as a reference for myself in a couple of months):

As long as none of the independent variables is a lagged value of the dependent variable (a value of the dependent variable from a previous period), then the estimated parameters themselves will be consistent and need not be adjusted for the effects of serial correlation. If, however, one of the independent variables is a lagged value of the dependent variable—for example, if the T-bill return from the previous month was an independent variable in the Fisher effect regression—then serial correlation in the error term will cause all the parameter estimates from linear regression to be inconsistent and they will not be valid estimates of the true parameters.
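
Not from the book, but a quick Monte Carlo sketch (numpy only, with made-up parameter values) illustrates the quoted passage: when the regressor is the lagged dependent variable, AR(1) serial correlation in the error makes the OLS slope converge to the wrong number, while with white-noise errors it converges to the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
b1, T, reps = 0.5, 2000, 300          # true slope, sample length, number of simulations

def ols_slope(rho):
    """OLS slope from regressing y_t on y_{t-1}, with AR(1) errors of parameter rho."""
    u = np.zeros(T)
    y = np.zeros(T)
    eps = rng.normal(size=T)
    for t in range(1, T):
        u[t] = rho * u[t - 1] + eps[t]    # serially correlated error term
        y[t] = b1 * y[t - 1] + u[t]       # lagged dependent variable as the regressor
    x, yy = y[:-1], y[1:]
    return np.cov(x, yy, bias=True)[0, 1] / np.var(x)

print("true slope:", b1)
print("mean OLS slope, AR(1) errors:      ",
      np.mean([ols_slope(0.6) for _ in range(reps)]))   # stays well above 0.5 even for large T
print("mean OLS slope, white-noise errors:",
      np.mean([ols_slope(0.0) for _ in range(reps)]))   # close to 0.5
```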

OMV is supposed to be omitted variable bias; one of my old textbooks abbreviated it that way, but looking at it now, the abbreviation does not make much sense.
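
And since it came up, here is a tiny numpy sketch (again my own made-up numbers) of what omitted variable bias does to the coefficient: leaving out a relevant regressor that is correlated with the included one shifts the estimate away from the true value, no matter how large the sample.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)            # omitted variable, correlated with x1
y = 1.0 + 0.5 * x1 + 0.7 * x2 + rng.normal(size=n)

# OLS slope of y on x1 alone (x2 omitted): cov(x1, y) / var(x1)
short_slope = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)
print("true coefficient on x1:", 0.5)
print("estimate with x2 omitted:", short_slope)   # roughly 0.5 + 0.7 * 0.8 = 1.06
```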