Multicollinearity

Multicollinearity is correlation among the independent variables, and a classic warning sign is insignificant t-statistics on the individual coefficients combined with a significant F-statistic for the overall regression.

My question is:

If the t-stat for just one variable is significant and the others are not, is there still a problem of multicollinearity or not?

Can anyone help? Thank you!!

I don't think so, because multicollinearity is signaled when none of the independent variables is individually significant even though the F-stat is significant and R² is high.

Thank you!! I thought so too…

This is not correct. The CFA curriculum does a really poor job with the topic. Multicollinearity is not black or white, but rather a spectrum. It is typically referenced when it has become problematic for the particular situation. Multicollinearity might not be an issue when none of the independent variables are significant despite a significant F-test, and multicollinearity might still be an issue even when some independent variables remain significant. In fact, the individual t-tests will only be affected for the variables that are collinear. For example, if X1 and X2 are related to one another but unrelated to X3, the t-test for X3’s coefficient will be unaffected by the collinearity of X1 and X2.

There might be a problem with multicollinearity, but CFA Institute does an abhorrent job of giving you the tools to determine this.
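To make the point above concrete, here is a quick simulation (my own sketch with made-up data, not from any curriculum): X1 and X2 are nearly duplicates of each other, X3 is unrelated to both, and all three truly affect y. The collinearity inflates the standard errors of the X1 and X2 coefficients, deflating their t-stats, while X3's t-stat is unaffected.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # X2 nearly duplicates X1
x3 = rng.normal(size=n)                   # X3 unrelated to X1 and X2
y = 1.0 + 0.5 * x1 + 0.5 * x2 + 0.5 * x3 + rng.normal(size=n)

# OLS by least squares, then the usual coefficient standard errors.
X = np.column_stack([np.ones(n), x1, x2, x3])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])          # residual variance
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
t = beta / se
# t-stats for x1 and x2 are dragged down by their collinearity;
# x3's t-stat stays large because it is uncorrelated with them.
print(dict(zip(["const", "x1", "x2", "x3"], t.round(2))))
```

Running this, the x3 coefficient comes out strongly significant while x1 and x2 individually look weak, even though all three betas are equal by construction.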

If I am not wrong, I remember an exercise where only one variable had an insignificant t-stat and there was multicollinearity…

Sorry, but that cannot occur. Multicollinearity happens only if you have at least two independent variables.

Exactly, we are speaking about multiple regression, so there are at least two independent variables. I was saying that I remember an exercise where only one of the independent variables had an insignificant t-test, and it was stated that there is multicollinearity.

Yes, this is what Schweser writes (and maybe the curriculum), and it is what I must memorize for the exam.

But I agree with tickersu: CFAI does a poor job here. His example illustrates a situation where multicollinearity occurs but the t-test of an independent variable is still significant. That may be the exercise that taytus encountered.

IMO, the mathematical root of multicollinearity is that the determinant of the matrix XᵀX (you find it in the Wikipedia link below) is nearly zero, which makes the matrix nearly singular, i.e., nearly non-invertible.

https://en.wikipedia.org/wiki/Multicollinearity#Definition

I’m not good enough at math, so I’m waiting for comments from the rest of you.
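The near-singularity point can be shown numerically (a sketch with made-up data): as X2 gets closer to being a copy of X1, XᵀX approaches singularity and its condition number blows up, which is exactly what makes the coefficient estimates unstable.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
conds = []
for noise in (1.0, 0.1, 0.01):  # smaller noise = x2 more collinear with x1
    x2 = x1 + rng.normal(scale=noise, size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    XtX = X.T @ X
    conds.append(np.linalg.cond(XtX))  # condition number of X'X
    print(f"noise={noise}: det={np.linalg.det(XtX):.4g}, cond={conds[-1]:.4g}")
```

The determinant shrinks and the condition number grows by orders of magnitude as the collinearity tightens; inverting XᵀX (which OLS requires) becomes numerically and statistically fragile.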

It’s tricky. What if you have an insignificant t-test, significant F-test, yet low R squared?

Bingo.

MC can still be an issue. Think of these as road signs that indicate possible danger ahead, but they don’t actually tell you what the driving is like. MC effects can vary and need a diagnostic work up more than described in the book. They don’t use very good/efficient techniques in the CFAI books.

Mr RS wrote:

It’s tricky. What if you have an insignificant t-test, significant F-test, yet low R squared?

MC can still be an issue. Think of these as road signs that indicate possible danger ahead, but they don’t actually tell you what the driving is like. MC effects can vary and need a diagnostic work up more than described in the book. They don’t use very good/efficient techniques in the CFAI books.

Right. CFAI material is frequently an oversimplification. Also, they like to leave relevant information out of questions and make you assume what that information would be, or that there’s no indication of a problem in that area (they especially like to do this in ethics). It can be maddening if you really know the topic or make the wrong assumption about the missing information (which might be ‘obvious’ in hindsight, but not ex-ante). Undoubtedly, we all will miss a few questions because of this, but unless you are a generally bad guesser or run a string of bad luck, it should even out.

In terms of training for being an analyst, it’s probably a good skill to develop. You never have complete information to analyze an investment, right? You have to make guesses/estimates, put your ‘money’ down on your best one, and then tally up the score afterwards. And, critically, learn from your mistakes, but don’t perseverate over them. It’s also why you never get all your bets right and need to learn to diversify and manage risk.

Quant chat is lit.

That would account for the _multi_ in _multicollinearity_.

(Although, truth be told, it would also account for the _co_. In fact – as I’m sure all of you already know – multicollinearity is a pleonasm.)

A significant F-stat with low t-stats is merely a warning sign of potential MC, not a definitive test one way or the other. The rule of thumb is that any two independent variables with a correlation > 0.7 (or < −0.7, obviously) are cause for concern.
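That rule of thumb is easy to check mechanically. A sketch with hypothetical data below flags pairs breaching |r| > 0.7 and also computes variance inflation factors (VIFs, not mentioned in this thread but a standard companion diagnostic: the diagonal of the inverse correlation matrix of the predictors).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)  # population corr(x1, x2) = 0.8
x3 = rng.normal(size=n)

# Pairwise correlation matrix of the predictors.
R = np.corrcoef(np.column_stack([x1, x2, x3]), rowvar=False)
flags = [(i, j) for i in range(3) for j in range(i + 1, 3) if abs(R[i, j]) > 0.7]
print("pairs breaching |r| > 0.7:", flags)

# VIF_j = [R^-1]_jj; values well above 1 indicate inflated coefficient variance.
vif = np.diag(np.linalg.inv(R))
print("VIFs:", vif.round(2))
```

Here only the (x1, x2) pair trips the rule, and their VIFs sit well above x3's, which stays near 1. Note that pairwise correlations can miss collinearity involving three or more variables jointly, which is one reason VIFs are useful.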

I know why you are saying it’s tricky… I understand the situation you are describing… But alternatively, what could this situation indicate? Conditional heteroskedasticity? :slight_smile:

That’s the thing about multicollinearity. It really is not that straightforward.

You’re right that it isn’t very straightforward, but it’s a lot easier when you learn about it from a real statistics reference or course. CFA Institute creates the impression that multicollinearity = bad, but this isn’t necessarily the case. In fact, it’s not even unusual for MC to be (relatively) a non-issue. There are many more methods for handling multicollinearity than the CFAI describes. The methods I’m referring to aren’t necessarily fancy, nor does one work in every scenario. Removing one or more of the collinear variables is very simple, but we can do better in many cases, assuming that something actually does need to be fixed.
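One such method beyond dropping a variable is ridge regression (my own sketch, not a technique named in this thread or the curriculum; the penalty λ = 10 is an arbitrary illustrative value). With two nearly duplicate regressors, OLS splits their joint effect erratically between them; the ridge penalty stabilizes the split by shrinking the coefficients.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)  # nearly duplicate regressor
y = 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)

# Center everything so we can drop the intercept and penalize both slopes.
X = np.column_stack([x1, x2])
X = X - X.mean(axis=0)
yc = y - y.mean()

beta_ols = np.linalg.lstsq(X, yc, rcond=None)[0]

lam = 10.0  # ridge penalty (illustrative; in practice chosen by cross-validation)
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ yc)

# OLS may assign wildly offsetting values to the twin regressors;
# ridge pulls both estimates toward the stable shared effect.
print("OLS:  ", beta_ols.round(3))
print("ridge:", beta_ridge.round(3))
```

The trade-off is a small bias in exchange for a large variance reduction on the collinear coefficients, which is often worthwhile when the goal is prediction rather than isolating each variable's effect.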

Tickersu, out of curiosity, what’s your background?