Can someone explain why an increase in expected return would give a lower VAR, as the answer key on page A-7 suggested? VAR = expected return - 1.65*(standard deviation). If expected return increases, shouldn't VAR increase??? The answer key states the opposite.

VAR measures the downside part of the normal curve, so if you increase the expected return, the whole curve shifts right: the bottom 5% cutoff rises and the VAR (the potential loss) decreases. For example: E(R) = 10% with a deviation of 10% gives a 5% VAR cutoff of 10% - 1.65*10% = -6.5% (that is, 5% of the time you'd expect to return less than -6.5%). With E(R) = 20% and the same 10% deviation, the 5% cutoff is 20% - 16.5% = +3.5% (that is, 5% of the time you'd expect to return less than +3.5%). So your Value at Risk decreases as expected return increases. Does this help?
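A minimal sketch of the arithmetic above (the function name `var_cutoff` is just an illustration, not from any library; 1.65 is the one-tailed 5% z-score for a normal distribution):

```python
def var_cutoff(expected_return, std_dev, z=1.65):
    """5th-percentile return under a normal assumption: E(R) - z * sigma.
    A negative cutoff means a loss is expected 5% of the time."""
    return expected_return - z * std_dev

# Reproducing the numbers above (rounded to avoid float noise):
print(round(var_cutoff(0.10, 0.10), 4))  # cutoff is negative: a 6.5% loss
print(round(var_cutoff(0.20, 0.10), 4))  # cutoff is positive: a 3.5% gain
```

Raising E(R) from 10% to 20% moves the cutoff from -6.5% to +3.5%, which is exactly why the VAR (the loss) shrinks.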

I still didn't get this. Let's take another example. Case 1: E(R) = 10% with a deviation of 5%, so the 5% VAR cutoff = 1.75% (that is, 5% of the time you'd expect to return less than 1.75%). Case 2: E(R) = 20% with a deviation of 5%, so the 5% VAR cutoff = 11.75% (that is, 5% of the time you'd expect to return less than 11.75%). If we assume a portfolio of $1 million, then the $VARs would be Case 1: $17,500 and Case 2: $117,500. Did the VAR not increase? Am I making some silly mistake?

CareerChange… the example you gave is not exactly a "good" VAR example. Using your figures, those portfolios will never (mathematically) show a loss at the 95% confidence level. In case (1), the E(R) range is from +1.75% to +18.25% (10 +/- 1.65*5): no negative return. In case (2), the E(R) range is from +11.75% to +28.25% (20 +/- 1.65*5): no negative return. Bottom line: at a 5% VAR (95% confidence), the portfolios in your examples are not expected to experience a loss, so in those cases a $VAR is meaningless. Does this make sense? In general, if a portfolio has a loss in its range of returns, a higher expected return will give a lower $VAR.
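A sketch of that point, with the dollar VAR floored at zero when even the 5th-percentile return is a gain (the helper name `dollar_var` is hypothetical, for illustration only):

```python
def dollar_var(portfolio_value, expected_return, std_dev, z=1.65):
    """Dollar VaR: the loss at the 5th-percentile return, floored at
    zero when even that worst-case cutoff is still a gain."""
    cutoff = expected_return - z * std_dev  # 5th-percentile return
    return max(0.0, -cutoff * portfolio_value)

# CareerChange's cases: both cutoffs are gains (+1.75%, +11.75%),
# so the expected dollar loss at 95% confidence is zero in both.
print(dollar_var(1_000_000, 0.10, 0.05))  # 0.0
print(dollar_var(1_000_000, 0.20, 0.05))  # 0.0

# By contrast, with E(R) = 10% and sigma = 10% the cutoff is -6.5%,
# so there is a genuine $65,000 VaR on a $1 million portfolio.
print(round(dollar_var(1_000_000, 0.10, 0.10)))
```

The $17,500 and $117,500 figures aren't losses at all; they are the smallest gains expected 95% of the time.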

Just to add: in your example, case (1)'s $17,500 is the 5% minimum gain, and case (2)'s $117,500 is the 5% minimum gain. Gain is good, right?