I am no expert in quants, far from it really, but I am fairly sure multicollinearity leads to coefficient estimates that are consistent yet unreliable. As for serial correlation and heteroskedasticity, I think the coefficient estimates themselves are unaffected: they remain consistent and reliable; it is the standard errors that get distorted.
“Even though multicollinearity does not affect the consistency of slope coefficients, such coefficients themselves tend to be unreliable.”
“The most common way to detect multicollinearity is the situation where t-tests indicate that none of the individual coefficients is significantly different than zero, while the F-test is statistically significant and the R2 is high.”
Regarding multicollinearity: my understanding of the whole regression topic in the CFA curriculum is that it is not only about making a precise prediction but also about understanding the factors that drive that prediction. Take a simple regression:
y = 0 + 2*risk_1 + 1*risk_2
So with risk_1 = 1 and risk_2 = 3 you get y = 5, and that prediction holds even under multicollinearity. But if you are interested in the relative contribution of risk_1 to the output, you will run into problems when you refer to the given b_1 = 2. Normally you could say that, holding everything else constant (i.e. controlling for the other independent variable), risk_1 influences the output with a sensitivity of 2. With multicollinearity involved, however, that coefficient itself is messed up: part of risk_1's influence gets absorbed into the risk_2 coefficient, so the pure influence is probably something else. Now think of the Fama-French model and the consequences for how we interpret its results.
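The prediction-versus-attribution split can be made concrete with a small resampling sketch (again with made-up data): refit the same collinear model on fresh samples, and the individual slope b_1 swings wildly while the prediction at a fixed point barely moves.

```python
import numpy as np

# Hypothetical experiment: repeatedly redraw data from the same collinear
# model, record the slope on x1 and the fitted value at x1 = x2 = 1.
rng = np.random.default_rng(0)
n, sims = 50, 200
b1s, preds = [], []
for _ in range(sims):
    x1 = rng.standard_normal(n)
    x2 = x1 + 0.01 * rng.standard_normal(n)    # near-perfect collinearity
    y = 2.0 * x1 + 1.0 * x2 + rng.standard_normal(n)
    X = np.column_stack([np.ones(n), x1, x2])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    b1s.append(b[1])
    preds.append(b @ np.array([1.0, 1.0, 1.0]))  # yhat at x1 = x2 = 1

b1s, preds = np.array(b1s), np.array(preds)
print("std of b1 across samples:", b1s.std())  # unstable attribution
print("std of prediction:      ", preds.std()) # stable forecast
```

The slope bounces far from its true value of 2 from sample to sample, yet the fitted value stays close to the true y = 3, which is the sense in which multicollinearity leaves predictions intact but ruins the interpretation of individual coefficients.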
The only thing I am not so sure about is how controlling for the other variable comes into play here. From my explanation, it seems that as long as the controlling for all other variables is genuinely ensured (also in the interpretation of the model), we should not have any problems, albeit that this might be impractical.