Standard Deviation expressed as %

I am looking over some Morningstar material and they are using a percentage as a proxy for risk. I assume they are talking about standard deviation expressed as a percent… Can you explain the calculation for converting standard deviation to percentage terms, comment on why Morningstar may be using percentages instead of regular standard deviation, and highlight some of the pitfalls I will want to avoid when using percentages to compare the risk/return of two portfolios? Your time is appreciated - thank you for the help.

Standard deviation is in the same units as the mean. So the standard deviation here is the standard deviation of returns, which are expressed in percent. What else would you want it to be? Certainly not price, because the price - in and of itself - tells you very little (except possibly in fixed income; I’d have to think about that).

Perhaps you’d care to explain the following if you have a chance… return 12%, risk 4%. How would one interpret that?

To convert SD expressed in decimal form to % form, you can:

  • multiply the decimal by 100,
  • divide the decimal form by .01, or
  • move the decimal 2 places to the right and add a % sign to the end.

I usually choose the third option.

E.g., if SD in decimal form is .159, moving the decimal two places to the right and adding a % sign to the end gives you 15.9%.
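If it helps to see it spelled out, here’s a minimal sketch in Python (the variable names are mine, and the number is just the example above):

```python
# Convert a standard deviation from decimal form to percent form.
sd_decimal = 0.159

# Options 1 and 2 from the list above produce the same number:
sd_pct = sd_decimal * 100    # multiply by 100
sd_pct = sd_decimal / 0.01   # or, equivalently, divide by .01

# Option 3 (move the decimal two places, append a % sign) is just
# formatting; Python's percent format does exactly that:
print(f"{sd_pct:.1f}%")      # 15.9%
print(f"{sd_decimal:.1%}")   # 15.9% -- the % format multiplies by 100
```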

This is one of the “hidden” Level 1 LOS’s.


Found an article, thanks for the feedback. For some reason I was making more of this than I should have…

http://voices.yahoo.com/using-standard-deviation-risk-measurement-better-3779326.html

cheers

Mean return is 12%

SD of returns is 4%

About 95% of the time, the returns fall within 12% +/- (2)(4%); in other words, between 4% and 20%.

If the SD were only 3%, then the returns would fall between 6% and 18% about 95% of the time. If the average return were still 12%, this would give you a better return for the amount of risk you were taking because the chance of making 4% or less is substantially lower.

It assumes that returns are normally distributed, and usually they aren’t. But it’s still a reasonable amount of information communicated in just two numbers.
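For what it’s worth, here’s a small Python sketch of that interpretation (the two-standard-deviation band is the approximation above; scipy is assumed to be available, and the numbers are the ones in this example):

```python
from scipy.stats import norm

mean_return = 0.12  # 12% average return

for sd in (0.04, 0.03):
    # ~95% of a normal distribution lies within 2 SDs of the mean.
    low, high = mean_return - 2 * sd, mean_return + 2 * sd
    # Probability of making 4% or less, under the normality assumption.
    p_low = norm.cdf(0.04, loc=mean_return, scale=sd)
    print(f"SD {sd:.0%}: ~95% band {low:.0%} to {high:.0%}, "
          f"P(return <= 4%) = {p_low:.2%}")
```

The drop in P(return <= 4%) from roughly 2.3% at SD 4% to roughly 0.4% at SD 3% is the “substantially lower” chance mentioned above.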

Perfectly clear answer - thank you. Seems like a stretch to simplify all of that and simply call it “risk”, but they clearly know more than I do. Much appreciated.

If you look at it as a measure of the chance of losing money (or making less than you expect), then it makes some sense as a measure of risk. It’s an incomplete measure of risk, but it is not an unreasonable one, as long as you are aware of its limitations.

Using SD as a risk measure implicitly assumes that everything that causes risk in the present has appeared in the historical record often enough to show up in the distribution of returns. That’s obviously not entirely true, because some events are very infrequent, and because rules change and societies evolve responses to certain risks. But it is true that a lot of the most common sources of risk have usually shown up in the distribution. That’s why SD is an incomplete measure, but not a completely useless one.
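To make that concrete, here’s a rough sketch of where the SD number comes from in practice (Python; the monthly returns below are made up purely for illustration):

```python
import statistics

# Hypothetical monthly returns in decimal form (made-up numbers).
returns = [0.021, -0.013, 0.034, 0.008, -0.027, 0.015, 0.002, -0.005]

mean = statistics.mean(returns)
sd = statistics.stdev(returns)  # sample standard deviation

# The SD only reflects risks that actually appear in this sample;
# an event that never made it into the data leaves no trace here.
print(f"mean {mean:.2%}, SD {sd:.2%} per period")
```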