Am I correct in assuming…

Conditional heteroskedasticity / autocorrelation / multicollinearity:
-CAN trust the regression coefficients
-CANNOT trust any predictions made with the model (i.e. heteroskedasticity produces SEs that are too low, so t-stats are too high)

Misspecification:
-CANNOT trust either the slope coefficient values or predictions made with the model

Am I missing any other big differences? Thanks
Serial Correlation: Where the error terms are correlated (also called autocorrelation). The most common effect is that the estimates of the standard errors of the slope coefficients may be incorrect. A Durbin-Watson statistic is used to test for serial correlation.

Multicollinearity: Causes the R-squared and occurs when two or more of the independent variables are highly correlated.

CH: Causes the standard error to be underestimated, causing deceptively high t-stats. This can cause an insignificant slope coefficient to appear significant.

I don't understand what you mean by "you can trust the regression coefficients but you can't trust any predictions from the model"? Those statements seem to contradict each other, as the regression coefficients basically are the model.
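To make the Durbin-Watson point concrete, here is a minimal NumPy sketch (simulated data; everything below is illustrative, not from the thread): it fits OLS on data whose errors follow an AR(1) process, then computes DW = Σ(e_t − e_{t−1})² / Σe_t². Since DW ≈ 2(1 − ρ), a value well below 2 flags positive serial correlation.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300
x = rng.normal(size=n)

# AR(1) errors with rho = 0.8 -> strong positive serial correlation
e = np.empty(n)
e[0] = rng.normal()
for t in range(1, n):
    e[t] = 0.8 * e[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + e

# Fit OLS and grab the residuals
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Durbin-Watson statistic: about 2(1 - rho), so roughly 0.4 here
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
print(dw)  # well below 2 -> evidence of positive serial correlation
```

With independent errors the same statistic would come out near 2; values near 0 indicate positive autocorrelation and values near 4 negative autocorrelation.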
Sorry, I meant to say "causes the R-squared to be overstated" for multicollinearity.
"…I don't understand what you mean by 'you can trust the regression coefficients but you can't trust any predictions from the model'?" I think this is what muffin09 means by the above: when multicollinearity, CH, or autocorrelation occurs, the calculated coefficients are fine; however, the standard errors are not correct. Hence the t-stats are incorrect, which is why the significance tests are unreliable.
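A small Monte Carlo sketch of that point (simulated data, pure NumPy; the setup and numbers are illustrative assumptions, not anything from the thread): under heteroskedasticity the OLS slope stays unbiased, but the conventional standard error understates the slope's true sampling variability, which is exactly what inflates the t-stats.

```python
import numpy as np

rng = np.random.default_rng(42)
n, reps = 200, 2000
slopes, naive_ses = [], []

for _ in range(reps):
    x = rng.uniform(0.0, 1.0, n)
    # Heteroskedastic errors: their spread grows with x
    e = rng.normal(scale=2.0 * x)
    y = 1.0 + 2.0 * x + e  # true slope is 2.0

    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta

    # Conventional ("naive") OLS covariance: s^2 (X'X)^-1,
    # which wrongly assumes constant error variance
    s2 = resid @ resid / (n - 2)
    cov = s2 * np.linalg.inv(X.T @ X)

    slopes.append(beta[1])
    naive_ses.append(np.sqrt(cov[1, 1]))

slopes = np.array(slopes)
print(slopes.mean())       # close to 2: the coefficient is still unbiased
print(slopes.std())        # the slope's actual sampling variability
print(np.mean(naive_ses))  # smaller: naive SE understates it -> inflated t-stats
```

The first number shows why the coefficients themselves can be trusted; the gap between the last two is the understated standard error that makes the significance tests misleading.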