QBank 86547 - Multicollinearity Impact

Which of the following problems, multicollinearity and/or serial correlation, can bias the estimates of the slope coefficients?

A) Multicollinearity, but not serial correlation.
B) Serial correlation, but not multicollinearity.
C) Both multicollinearity and serial correlation.

According to CFAI V1 p. 315, "multicollinearity does not affect the consistency of the OLS estimates of the regression coefficients, but the estimates become extremely imprecise and unreliable." So when it comes to the exam, should we consider multicollinearity to "bias" / "affect the consistency" of the regression coefficients?

> should we consider multicollinearity to "bias" / "affect the consistency" of the regression coefficients?

Yes

swaptiongamma wrote:

> > should we consider multicollinearity to "bias" / "affect the consistency" of the regression coefficients?
>
> Yes

No. Biasedness and inconsistency are two completely different characteristics of an estimator. Serial correlation produces neither bias nor inconsistency. It does, however, render the estimators inefficient, which biases the standard errors down and the t-stats up. Multicollinearity does not create bias either, but it renders OLS unable to fully distinguish the individual effects of your independent variables and creates an upward bias in the standard errors. That is not the same as a biased estimator.
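To see this numerically, here is a quick Monte Carlo sketch in numpy (my own illustration, not from any post here; the AR(1) setup and parameter values are arbitrary choices). One nuance worth flagging: the downward bias in the conventional standard error shows up most clearly when the regressor is itself positively autocorrelated, as is typical in time-series data, so the sketch makes both x and the errors AR(1):

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_sims, rho, true_beta = 200, 2000, 0.8, 2.0

def ar1(rho, size, rng):
    """Simulate an AR(1) series z_t = rho * z_{t-1} + u_t."""
    u = rng.normal(size=size)
    z = np.empty(size)
    z[0] = u[0]
    for t in range(1, size):
        z[t] = rho * z[t - 1] + u[t]
    return z

slopes, reported_se = [], []
for _ in range(n_sims):
    x = ar1(rho, n, rng)   # autocorrelated regressor (typical in time series)
    e = ar1(rho, n, rng)   # positively serially correlated errors
    y = 1.0 + true_beta * x + e

    X = np.column_stack([np.ones(n), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)           # usual iid-error variance estimate
    cov = s2 * np.linalg.inv(X.T @ X)      # conventional OLS covariance matrix
    slopes.append(b[1])
    reported_se.append(np.sqrt(cov[1, 1]))

print("mean slope estimate :", np.mean(slopes))       # ~2.0 -> no bias
print("true sampling stdev :", np.std(slopes))        # actual spread of estimates
print("mean reported s.e.  :", np.mean(reported_se))  # noticeably smaller -> biased down
```

The slope estimates average out to the true value (no bias), but the standard error OLS reports is well below the actual sampling spread, which is exactly the inefficiency/understated-s.e. story above.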

wyantjs, to follow up, couldn't serial correlation also bias the std. errors up, causing t-stats to be too low, as in the case of negative serial correlation?

Ali, what you said should be exactly the opposite. Serial correlation --> lower standard errors --> high t-stats --> a variable that is not significant appears significant. Multicollinearity --> higher standard errors --> low t-stats --> a variable that really is significant appears insignificant. What wyantjs said is a very accurate and concise description: since the independent variables are correlated, OLS is not able to distinguish the impact of the individual variables on the dependent variable when multicollinearity is present.

Right or wrong?

Serial correlation (+ve) --> leads to Type I error
Serial correlation (-ve) --> leads to Type II error
Multicollinearity --> leads to Type II error

TheAliMan wrote:

> wyantjs, to follow up, couldn't serial correlation also bias the std. errors up, causing t-stats to be too low, as in the case of negative serial correlation?

This would seem logical, but it isn't the case. A little math (the result of a geometric series) will show you that the variance of the error term involves the square of the correlation coefficient. What you get is: true var = observed var / (1 - rho^2). Since rho must be less than one in absolute value to avoid an explosive series, we see that the true variance is always greater than the observed one. The closer rho is to one, the greater the bias; the closer rho is to zero, the smaller the bias. Rho = 1 gives an undefined variance, and thus we have a problem. Therefore, the observed variance must be biased downward. Intuitively, the correlation among error terms masks the volatility of the true underlying process: the model doesn't pick up on the true variance because it is being clouded by correlation among successive errors.
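A quick simulation bears out the geometric-series result quoted above (my own sketch; rho and the sample size are arbitrary). For AR(1) errors e_t = rho * e_{t-1} + u_t with Var(u) = sigma_u^2, the unconditional variance works out to sigma_u^2 / (1 - rho^2):

```python
import numpy as np

rng = np.random.default_rng(42)
rho, sigma_u, n = 0.7, 1.0, 200_000

# Simulate AR(1) errors: e_t = rho * e_{t-1} + u_t
u = rng.normal(scale=sigma_u, size=n)
e = np.empty(n)
e[0] = u[0]
for t in range(1, n):
    e[t] = rho * e[t - 1] + u[t]

print("simulated Var(e)     :", e[1_000:].var())              # drop burn-in
print("sigma_u^2/(1-rho^2)  :", sigma_u**2 / (1 - rho**2))    # ~1.96
```

Note that the formula involves rho squared, so the sign of rho doesn't matter: the underlying error variance exceeds the innovation variance for negative rho too, which is the point being made.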

Neither biases the slope coefficients. The slope coefficients will be biased only if the zero conditional mean assumption (E[e | X] = 0) is violated.
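For example, here is a rough sketch (my own construction, with made-up coefficients) of the one thing that does bias the slope: an omitted variable correlated with the included regressor, which violates E[e | X] = 0:

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_sims, true_beta = 500, 2000, 2.0

slopes = []
for _ in range(n_sims):
    x = rng.normal(size=n)
    z = 0.8 * x + rng.normal(size=n)              # omitted variable, correlated with x
    y = 1.0 + true_beta * x + 1.5 * z + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])          # regression omits z
    slopes.append(np.linalg.lstsq(X, y, rcond=None)[0][1])

# The error term now contains 1.5*z, so E[e | x] = 1.2*x != 0.
print("mean slope:", np.mean(slopes))   # ~2.0 + 1.5*0.8 = 3.2 -> clearly biased
```

The slope converges to 3.2, not the true 2.0, no matter how large the sample: that's bias (and inconsistency), which neither serial correlation nor multicollinearity produces on its own.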

Thanks all for your responses here. The answer from Schweser is "(A) Multicollinearity, but not serial correlation," and this is the reason I started this post. Detailed explanation: "Multicollinearity can bias the coefficients because of the shared movement of the independent variables. Serial correlation biases the standard errors of the slope coefficients." My understanding (that neither biases the coefficients) leads me to "no choice" in this question. My little summary:

| Problem | Reg. coefficients | Coeff. std. errors | Stat. inference |
|---|---|---|---|
| Heteroskedasticity (conditional) | Consistent | Unreliable | Type I error |
| Serial correlation (+ve) | Consistent (inconsistent if a lagged dependent variable is included) | Unreliable | Type I error |
| Multicollinearity | Consistent | Unreliable | Type II error |

Any comments are welcome :slight_smile:

Serial correlation = bad t-stats and bad hypothesis testing. SC affects the values of the dependent variable (Y) only. SC will not affect the independent variables (X's) or the independent variables' effects on the dependent variable, and therefore it has no effect on the slopes (B's).

Y = A + B1 * X1 + B2 * X2 + e

SC in e may lead to bad estimates of Y, but it has no effect on the B's and X's. The X's effect on Y is (essentially) the same no matter how badly the error screws up the dependent variable.

Multicollinearity means the independent variables are correlated (the X's are correlated). When the X's are correlated, you cannot be sure that B1 and B2 are correct. Assume we are trying to see how ROE and ROA affect a share's price. ROE and ROA are going to be correlated, since we are looking at net income in both independent variables. This regression may show B1 = 5 for ROE and B2 = 10 for ROA, but these values may be completely wrong. Price might have little relation to ROA but a lot to ROE, but since ROE and ROA have multicollinearity due to the net income component, we cannot tell what the true slopes should be.

So: SC = affects t-stats and hypothesis testing. Multi = unreliable betas.
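A quick simulation of the ROE/ROA story (my own construction; the 0.95 collinearity and the "true" betas of 5 and 10 are made up) shows exactly this: the individual estimates are right on average, but they swing wildly from sample to sample because of the inflated standard errors:

```python
import numpy as np

rng = np.random.default_rng(7)
n, n_sims = 100, 2000
b1, b2 = 5.0, 10.0        # "true" effects of the ROE-like and ROA-like variables

est = []
for _ in range(n_sims):
    roe = rng.normal(size=n)
    roa = 0.95 * roe + 0.05 * rng.normal(size=n)   # nearly collinear with roe
    price = 1.0 + b1 * roe + b2 * roa + rng.normal(size=n)
    X = np.column_stack([np.ones(n), roe, roa])
    est.append(np.linalg.lstsq(X, price, rcond=None)[0][1:])

est = np.array(est)
print("mean estimates    :", est.mean(axis=0))  # ~[5, 10] -> unbiased on average
print("stdev of estimates:", est.std(axis=0))   # large -> individually unreliable
```

In any single sample you might see B1 = 7 and B2 = 8, or B1 = 2 and B2 = 13: the regression pins down the combined effect well, but not who gets the credit, which is the "bad betas" point above stated precisely (unbiased but imprecise, so low t-stats and Type II errors).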