Does anyone know why this tends to be the case in regression models that have multicollinearity?

tickersu does.

He’ll likely chime in here.

Standard errors represent the uncertainty surrounding their corresponding estimates. Consistent estimates have less uncertainty when we have more information. In the case of a regression coefficient, the more we know about the relationship of Y with X_{1}, the smaller the standard error for the beta 1 hat coefficient will be. If X_{1} and X_{2} are related, each one carries less unique information about its relationship with Y. It becomes harder to see how X_{1} or X_{2} individually relates to Y, so we have less certainty in beta 1 hat or beta 2 hat. That’s the detailed explanation.

The simpler explanation comes from the formula for the standard error for beta i hat.

Suppose we have X_{1} and X_{2} to predict Y. The standard error for beta i hat, where i = 1 or 2, is {MSE/[(SSx_{i})*(1-R^{2}_{aux})]}^{0.5}, where MSE = SS(error)/(n-k-1) is the mean squared error.

Where SS(error) is the sum of squared errors of prediction, SSx_{i} is the sum of squared deviations of X_{i} about its mean, and R^{2}_{aux} is the R^{2} from the auxiliary regression predicting X_{i} with the other X variable(s), including an intercept: for beta 1 hat, that means regressing X_{1} on X_{2} (and on any other X variables if more than 2 are involved). This R^{2} tells us the proportion of variation in X_{i} explained (shared) by the other predictors. Subtracting it from 1 gives us the unique (unshared) variation in X_{i}. As the predictors become more related, R^{2}_{aux} grows and 1-R^{2}_{aux} shrinks, so the denominator of the standard error gets smaller and the quotient (the standard error) gets bigger.
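If it helps to see it numerically, here’s a small numpy sketch (simulated data, so the variable names and the 0.8 correlation between the predictors are just assumptions for illustration). It computes the standard error of beta 1 hat two ways: from the usual (X'X)^{-1} matrix, and from the formula above using the auxiliary R^{2}. They match exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two correlated predictors (the correlation is built in for illustration)
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.6, size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])

# Full regression: coefficients and the textbook SE from MSE * (X'X)^{-1}
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
mse = resid @ resid / (n - 3)            # SS(error)/(n - k - 1), k = 2 slopes
se_direct = np.sqrt(mse * np.linalg.inv(X.T @ X).diagonal())

# Auxiliary regression: x1 on x2 (with intercept) to get R^2_aux
Xaux = np.column_stack([np.ones(n), x2])
gamma, *_ = np.linalg.lstsq(Xaux, x1, rcond=None)
aux_resid = x1 - Xaux @ gamma
r2_aux = 1 - aux_resid @ aux_resid / np.sum((x1 - x1.mean()) ** 2)

# Formula from above: SE(beta 1 hat) = {MSE / [SSx1 * (1 - R^2_aux)]}^0.5
ssx1 = np.sum((x1 - x1.mean()) ** 2)
se_formula = np.sqrt(mse / (ssx1 * (1 - r2_aux)))

print(se_direct[1], se_formula)
```

The two printed numbers agree, which is the whole point: the auxiliary R^{2} is exactly what links collinearity to the size of the standard error.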

Another point to note is that 1/(1-R^{2}_{aux}) is called the Variance Inflation Factor (VIF). It tells us the factor by which the variance of beta i hat is inflated due to X_{i}’s relationship with the other independent variables in the model. The square root of the VIF tells us the factor by which the standard error of beta i hat is inflated.
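A quick sketch of that calculation, again with simulated data (the 0.9 slope tying x2 to x1 is an assumption chosen to make the collinearity obvious): regress x1 on x2, take the R^{2}, and the VIF and the SE inflation factor fall right out.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# x2 is built to be strongly related to x1 (assumed setup for illustration)
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + rng.normal(scale=0.5, size=n)

# R^2 from the auxiliary regression of x1 on x2 (with intercept)
Xaux = np.column_stack([np.ones(n), x2])
gamma, *_ = np.linalg.lstsq(Xaux, x1, rcond=None)
resid = x1 - Xaux @ gamma
r2_aux = 1 - resid @ resid / np.sum((x1 - x1.mean()) ** 2)

vif = 1 / (1 - r2_aux)            # variance inflation factor
se_inflation = vif ** 0.5         # factor by which SE(beta 1 hat) is inflated

print(f"R^2_aux = {r2_aux:.3f}, VIF = {vif:.2f}, SE inflated by {se_inflation:.2f}x")
```

With no collinearity R^{2}_{aux} = 0, so VIF = 1 and nothing is inflated; as R^{2}_{aux} approaches 1, the VIF (and the standard error with it) blows up.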

How can you tell the difference between a trend model and an AR model just by looking at a regression?

Look at the independent variable(s):

- If it’s *t*, it’s a trend model
- If it’s (they’re) lagged values of the dependent variable, it’s an autoregressive model
- If it’s (they’re) something else, it’s something else
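To make the distinction concrete, here’s a minimal numpy sketch (the series y is simulated, so all the numbers are assumptions): the two models differ only in what sits on the right-hand side, the time index t versus the lagged series.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 100
y = np.cumsum(rng.normal(size=T)) + 50    # a made-up time series

# Trend model: regress y_t on the time index t
t = np.arange(1, T + 1)
Xtrend = np.column_stack([np.ones(T), t])
b_trend, *_ = np.linalg.lstsq(Xtrend, y, rcond=None)

# AR(1) model: regress y_t on its own lag y_{t-1}
Xar = np.column_stack([np.ones(T - 1), y[:-1]])
b_ar, *_ = np.linalg.lstsq(Xar, y[1:], rcond=None)

print("trend slope:", b_trend[1])
print("AR(1) coefficient:", b_ar[1])
```

Same dependent variable, same OLS machinery; the only tell is the regressor.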

Helpful or nah? I can try again if not, or S2k can stab at it!