CFAI Reading 12 Question 20

This is in regards to determining the correlation from Multiple R. In the prior reading on simple regression, it stated that taking the square root of R-squared to arrive at the correlation coefficient is only possible in simple regression. How are they applying it to this problem? Any help would be appreciated!

There is a small footnote in the text that mentions that. It is the correlation between the actual values and the forecast values of Y.

I got stuck on this question as well, since the text does specifically mention that you cannot simply square the correlation coefficient to get the coefficient of determination in a multiple regression, which this clearly is. The only reason I can see for this answer to be correct is because it is the MOST correct of the three choices given, even though it may still be wrong. If anyone has a better answer I would love to know.

For simple regression, if you take the square root of R^2, you will get r, which I believe is the correlation coefficient between the independent and the dependent variable. For multiple regression, if you take the square root of R^2, you will get Multiple R, which is the correlation between the actual values and the forecast values of the dependent variable. Hope that helps.
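If it helps to see it with numbers, here's a quick sketch in Python (my own toy data, not the curriculum question): it fits a two-variable regression with numpy and checks that the square root of R^2 matches the correlation between the actual and fitted values of the dependent variable.

```python
# Toy check: in multiple regression, sqrt(R^2) = correlation(actual Y, fitted Y).
# All data and variable names here are made up for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 200
X = rng.normal(size=(n, 2))                      # two independent variables
y = 1.5 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=3.0, size=n)

# Fit y = b0 + b1*X1 + b2*X2 by ordinary least squares
X_design = np.column_stack([np.ones(n), X])
betas, *_ = np.linalg.lstsq(X_design, y, rcond=None)
y_hat = X_design @ betas

# R^2 = explained variation / total variation
sst = np.sum((y - y.mean()) ** 2)
rss = np.sum((y_hat - y.mean()) ** 2)
r_squared = rss / sst

# Multiple R = correlation between actual and predicted y
multiple_r = np.corrcoef(y, y_hat)[0, 1]

print(r_squared, multiple_r ** 2)   # the two numbers agree
```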

Just trying to get some definitions clear.

R^2 = Explained variation / Total variation. So if R^2 = 36%, the explained variation (the sum of (Y predict - Y mean)^2) equals 36% of the total variation (the sum of (Y - Y mean)^2). R^2 has many names: coefficient of determination, R-squared, or multiple R-squared (for multiple variables). R^2 is also the square of R, and R is by definition the correlation between the predicted and actual values of the dependent variable.

Now, for a one-variable regression, R^2 is ALSO the square of r (note: small r, not BIG R), where r is the correlation coefficient between the dependent variable and the ONLY independent variable. Remember r = COV(X, Y) / (stddev X * stddev Y). Of course, for a multiple-variable regression you cannot do that, since there are many independent variables.

By definition, R^2 is also Explained variation / Total variation for a one-variable regression, but the point is that you can take a shortcut: calculate r = COV(X, Y) / (stddev X * stddev Y), then square it to get r^2 and thus also R^2, if you have only a one-variable regression. In this sense, you can calculate the R^2 of a regression without EVEN knowing the regression equation! In the multi-variable case you cannot use this shortcut; you must do it the long way, i.e., estimate the regression first, then compute SST, SSE, and RSS before you can calculate R^2.

Getting back to R vs. r: R is the correlation between the predicted and actual values of the dependent variable Y, which I take to mean R = COV(Y, Y predict) / (stddev Y * stddev Y predict). For one variable, R is also equal to r = COV(X, Y) / (stddev X * stddev Y). I hope it is clear now, based on the above explanation, that the only correct answer for the question is C.
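To make the "shortcut vs. long way" point concrete, here is a small Python sketch on made-up one-variable data: r^2 computed straight from COV(X, Y) and the standard deviations matches the R^2 you get the long way from the fitted regression's explained and total variation.

```python
# Toy one-variable regression: the r^2 shortcut equals R^2 computed the long way.
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=100)
y = 3.0 + 0.8 * x + rng.normal(scale=1.5, size=100)

# Shortcut: r = COV(X, Y) / (stddev X * stddev Y), no regression equation needed
r = np.cov(x, y, ddof=1)[0, 1] / (np.std(x, ddof=1) * np.std(y, ddof=1))

# Long way: fit the regression, then R^2 = explained variation / total variation
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x
r_squared = np.sum((y_hat - y.mean()) ** 2) / np.sum((y - y.mean()) ** 2)

print(r ** 2, r_squared)   # identical up to floating-point noise
```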

OK, so… I understand that Multiple R is the correlation between the actual and predicted values of the dependent variable. I also understand that R^2 is Multiple R squared. What I'm still confused on (and this may be getting a little too in depth, but if I don't ask it will continue to haunt me) is that if, for a simple regression, R = COV(Y, Ypredict) / (stddev Y * stddev Ypredict) and r = COV(X, Y) / (stddev X * stddev Y) and R = r, then COV(Y, Ypredict) / (stddev Y * stddev Ypredict) = COV(Y, X) / (stddev Y * stddev X). Therefore Ypredict = X, but how could that be if Ypredict = b0 + b1X + E? This is what doesn't make sense to me, except for instances where b0 = 0 and b1 = 1. Anyone?

FinNinja, no problem for asking.

> but how could that be if Ypredict = b0 + b1X + E

Nope, Ypredict = b0 + b1X, NOT Ypredict = b0 + b1X + E. Therefore:

COV(Y, Ypredict) / (stddev Y * stddev Ypredict)
= COV(Y, b0 + b1X) / (stddev Y * stddev(b0 + b1X))
= b1 * COV(Y, X) / (stddev Y * b1 * stddev X)
= COV(Y, X) / (stddev Y * stddev X)

(The constant b0 drops out of both the covariance and the standard deviation, and for b1 > 0 the b1 factors cancel.) Read the links below if you don't understand the mathematical manipulations I just did.

http://en.wikipedia.org/wiki/Covariance
http://en.wikipedia.org/wiki/Standard_deviation
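If the algebra still feels abstract, here is a quick numeric check in Python (arbitrary toy values for b0 and a positive b1): adding a constant and scaling by b1 leaves the correlation unchanged, so corr(Y, b0 + b1X) comes out equal to corr(Y, X) without needing b0 = 0 and b1 = 1.

```python
# Toy check: correlation is unchanged by adding a constant and scaling by b1 > 0,
# so corr(Y, b0 + b1*X) = corr(Y, X). Values of b0 and b1 are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 2.0 + 1.3 * x + rng.normal(size=500)

b0, b1 = 4.2, 1.3                      # any intercept, any positive slope
y_pred = b0 + b1 * x

corr_y_x = np.corrcoef(y, x)[0, 1]
corr_y_pred = np.corrcoef(y, y_pred)[0, 1]

print(corr_y_x, corr_y_pred)           # the two correlations are equal
```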

“So everything’s just wrapped up in a nice little package then!.. What? I really meant that!.. Sorry if it sounded sarcastic.” – Homer Simpson

Thanks.