Consequences of Regression Assumption Violation

Dear all,

Could anyone help me with a simple explanation of why serial correlation leads to the following consequences?

* In the case of positive correlation

+) The standard errors of the estimates (slope coefficient and intercept) are underestimated

+) The MSE is underestimated

* In the case of negative correlation

+) The standard errors of the estimates (slope coefficient and intercept) are overestimated

+) The MSE is overestimated

It is hard for me to remember these consequences without understanding the reasoning behind them.

If the errors are positively correlated, the true variance of the errors is larger than what the usual formula gives, because the usual formula omits the positive covariance terms. The MSE therefore underestimates the true error variance, and since the coefficient standard errors are derived from the model MSE, they are underestimated as well.
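One way to remember the direction is the variance-of-a-sum identity. The usual OLS formulas keep only the first term below, since independence makes every covariance zero:

$$
\operatorname{Var}\!\left(\sum_i e_i\right) \;=\; \sum_i \operatorname{Var}(e_i) \;+\; 2\sum_{i<j} \operatorname{Cov}(e_i, e_j)
$$

With positive serial correlation the dropped covariance terms are positive, so the usual formula understates the variance; with negative correlation the dropped terms are negative, so it overstates it.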

Negative correlation works the other way: truly calculating the variance of the errors would require a negative covariance term, which the usual formula omits because of the independence assumption (same reasoning as above). The usual formula therefore overestimates the true error variance, so the MSE and, in turn, the coefficient standard errors are overestimated.
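A small Monte Carlo makes the positive-correlation case concrete. The sketch below (my own illustrative setup, with made-up values n = 50, rho = 0.7, and a simple trend regressor) generates AR(1) errors, fits OLS, and compares the average textbook standard error of the slope with the empirical spread of the slope estimates across simulations:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, rho = 50, 2000, 0.7   # illustrative values, not from the original post

x = np.linspace(0, 1, n)
X = np.column_stack([np.ones(n), x])   # design matrix: intercept + slope

slopes, reported_se = [], []
for _ in range(reps):
    # AR(1) errors with positive serial correlation
    e = np.empty(n)
    e[0] = rng.normal()
    for t in range(1, n):
        e[t] = rho * e[t - 1] + rng.normal()
    y = 1.0 + 2.0 * x + e

    # OLS via the normal equations
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ beta
    mse = resid @ resid / (n - 2)                 # usual (independence-based) MSE
    cov = mse * np.linalg.inv(X.T @ X)            # usual coefficient covariance
    slopes.append(beta[1])
    reported_se.append(np.sqrt(cov[1, 1]))

avg_se = np.mean(reported_se)   # what the OLS output would report, on average
true_sd = np.std(slopes)        # actual sampling variability of the slope
print(f"average reported SE: {avg_se:.3f}, empirical SD: {true_sd:.3f}")
```

The average reported SE comes out noticeably smaller than the empirical standard deviation of the slope, which is exactly the underestimation described above; flipping the sign of rho reverses the comparison.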