Question -> hetero, serial, multi

An analyst is estimating whether company sales are related to three economic variables. The regression exhibits heteroskedasticity, serial correlation, and multicollinearity. Which of the problems needs to be corrected in order to obtain an accurate estimate of the regression parameters? A) Only serial correlation. B) Only multicollinearity. C) All of them. D) None of them.

A ?

C

B. Multicollinearity is the only one that actually affects the stability of the regression coefficient estimates. Hetero and serial correlation only affect the standard errors, which in turn throw off your t-stats.

B - nice explanation JB

Well done.

mcpass was the closest on this one. I went with ‘A’ on this too, since heteroskedasticity and multicollinearity don’t affect the estimates of the regression parameters, but they surely do distort the standard errors, and hence the t-values, and hence the inferences we draw from those statistics. So only ‘serial correlation’ should have been the answer. But they say the answer is D, and the reason they give for serial correlation not affecting the estimates is: “it’s only an issue when one of the independent variables is a lagged value of the dependent variable.” So how the heck did they know the regression did not have an independent variable that was a lagged version of the dependent variable, when the regression equation wasn’t given in the question? I still feel the answer should have been A and not D (because of poor/inaccurate wording) — Thoughts??

Joey…?

Dinesh… check your explanation. Multicollinearity does bias the slope coefficients due to the shared movement of the independent variables. Heteroskedasticity and serial correlation only affect the standard errors, not the stability of the coefficients.

Wikipedia says: “Multicollinearity does not actually bias results, it just produces large standard errors in the related independent variables. With enough data, these errors will be reduced.” (http://en.wikipedia.org/wiki/Multicollinearity) Not saying it is an unimpeachable source, but I don’t see why multicollinearity would bias the coefficients — I see why it would make them unstable, but not why it would make them biased estimators.
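Not part of the original thread, but the Wikipedia claim is easy to check by simulation. A minimal pure-Python Monte Carlo sketch (the function name, coefficients, correlation levels, and sample sizes are all my own illustrative choices): it repeatedly regresses y = 2·x1 + 3·x2 + noise and shows the slope estimates stay centered on the true values whether or not x1 and x2 are highly correlated — only their spread (the standard error) blows up.

```python
# Monte Carlo sketch (all numbers hypothetical): multicollinearity leaves OLS
# slope estimates unbiased but inflates their variability.
import random
import statistics

def simulate(rho, trials=2000, n=50, seed=1):
    """Return the b1 estimates from `trials` regressions of
    y = 2*x1 + 3*x2 + noise, where corr(x1, x2) ~ rho.
    No intercept: all variables are mean-zero by construction."""
    rng = random.Random(seed)
    b1s = []
    for _ in range(trials):
        x1 = [rng.gauss(0, 1) for _ in range(n)]
        # x2 shares variation with x1 in proportion to rho
        x2 = [rho * a + (1 - rho**2) ** 0.5 * rng.gauss(0, 1) for a in x1]
        y = [2 * a + 3 * b + rng.gauss(0, 1) for a, b in zip(x1, x2)]
        # OLS slope estimates via the 2x2 normal equations
        s11 = sum(a * a for a in x1)
        s22 = sum(b * b for b in x2)
        s12 = sum(a * b for a, b in zip(x1, x2))
        s1y = sum(a * c for a, c in zip(x1, y))
        s2y = sum(b * c for b, c in zip(x2, y))
        det = s11 * s22 - s12 * s12
        b1s.append((s22 * s1y - s12 * s2y) / det)
    return b1s

low = simulate(rho=0.0)    # uncorrelated regressors
high = simulate(rho=0.95)  # severe multicollinearity
print(statistics.mean(low), statistics.stdev(low))    # mean ~ 2, small spread
print(statistics.mean(high), statistics.stdev(high))  # mean ~ 2, much larger spread
```

In both cases the average slope estimate sits right at the true value of 2 — unbiased either way — but the trial-to-trial spread is several times larger under high correlation, which is exactly the “unstable but not biased” distinction being argued here.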

If your independent variables are correlated, then the variation in the dependent variable is going to be all screwy. You won’t be able to distinguish the individual impacts of each independent variable. It will look great, though: you’ll have a high R^2 and your F-test will be off the charts. But you can tell it’s present if those two occur alongside insignificant t-tests. Per the CFA text: “Although the presence of multicollinearity does not affect the consistency of the OLS estimates of the regression coefficients, the estimates become extremely imprecise and unreliable.” If I am running a regression of winter sales and my two variables are snowfall and temperature, I have an issue… too much correlation.
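To put that symptom checklist in concrete terms, here is a hedged single-regression sketch (the seed, correlation level, and sample size are my own illustrative choices, standing in for the snowfall/temperature example): with two nearly collinear regressors the overall fit’s R² stays high while the coefficient standard errors get inflated, which tends to drag the individual t-stats down.

```python
# Hypothetical single regression illustrating the multicollinearity symptoms:
# strong overall fit, but inflated standard errors on each coefficient.
import random

rng = random.Random(7)
n, rho = 40, 0.98
x1 = [rng.gauss(0, 1) for _ in range(n)]
x2 = [rho * a + (1 - rho**2) ** 0.5 * rng.gauss(0, 1) for a in x1]
y = [a + b + rng.gauss(0, 1) for a, b in zip(x1, x2)]  # true slopes both 1

# OLS via the 2x2 normal equations (no intercept; variables are
# mean-zero by construction)
s11 = sum(a * a for a in x1)
s22 = sum(b * b for b in x2)
s12 = sum(a * b for a, b in zip(x1, x2))
s1y = sum(a * c for a, c in zip(x1, y))
s2y = sum(b * c for b, c in zip(x2, y))
det = s11 * s22 - s12 * s12
b1 = (s22 * s1y - s12 * s2y) / det
b2 = (s11 * s2y - s12 * s1y) / det

# Fit statistics and coefficient standard errors
resid = [c - b1 * a - b2 * b for a, b, c in zip(x1, x2, y)]
sse = sum(e * e for e in resid)
sst = sum(c * c for c in y)
r2 = 1 - sse / sst
sigma2 = sse / (n - 2)
se1 = (sigma2 * s22 / det) ** 0.5
se2 = (sigma2 * s11 / det) ** 0.5
print(f"R^2 = {r2:.2f}, t(b1) = {b1/se1:.2f}, t(b2) = {b2/se2:.2f}")
```

The joint fit looks great, but the near-collinearity makes `det` small, which inflates both standard errors: the regression can’t tell the two regressors’ contributions apart, even though their combined effect is estimated well.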

In all 3, the parameters are unaffected in a typical scenario, so it would be D. Edit: the question only asks about the slope parameters, not the standard errors, so no adjustment is required.

You’ve just agreed, then, that the answer is D: jbisback Wrote: ------------------------------------------------------- > per CFA text "Although ****the presence of > multicollinearity does not affect the consistency > of the OLS estimates of the regression > coefficients****, the estimates become extremely > imprecise and unreliable. So although the estimates will be more variable (i.e., standard error increased), they are unbiased estimators — so no adjustment is required.

mcpass Wrote: ------------------------------------------------------- > Joey…? Was fishing. Unfortunately the fish won. I sat there looking stupid having a battle of wits with a cod fish and lost. I think the problem is “Which of the problems needs to be corrected in order to obtain an accurate estimate of the regression parameters?” The question isn’t really phrased in a way that is answerable. If the only criteria for “accurate” is “consistent”, then I guess I like D because your regression estimates are consistent in the presence of all these things. I wouldn’t like to be in a spot where I am saying my regression estimators are “accurate” (which is not a mathematical property of estimators) in the presence of bias and inefficiency.

Joey… what is your background? How is it that you’re such a quant genius?

I stayed at a Holiday Inn Express.

JoeyD for President…

ha ha!!

Man, I hope we don’t get questions like these. I thought I had a decent handle on quant, but I would have missed this for sure.

so the answer is D huh?