# asset allocation: foreign currency

So I know that the standard deviation of portfolio return in domestic currency is the square root of: (SD of portfolio return in FC)² + (SD of foreign exchange rate return)² + 2 × correlation between the foreign exchange rate return and the portfolio return in FC × (SD of portfolio return in FC) × (SD of foreign exchange rate return).

However, if Rfc is a risk-free asset, the standard deviation of portfolio return in DC = standard deviation of the foreign exchange rate return × (1 + Rfc). Why?

If Rfc is a risk-free asset, its SD should be 0. The above formula would then reduce to just the SD of the foreign exchange rate return. Why should we multiply by (1 + Rfc)?

Note that for the standard deviation of a portfolio’s returns, you’re adding the returns of the individual securities; for the standard deviation in your domestic currency of a portfolio denominated in a foreign currency, you’re _multiplying_ the portfolio return and the currency return. They’re very different (statistically). Concretely, R_DC = (1 + R_FC)(1 + R_FX) − 1. When R_FC is risk-free it is a constant, so R_DC = R_FC + (1 + R_FC) × R_FX; multiplying a random variable by the constant (1 + R_FC) scales its standard deviation by that same factor, hence SD(R_DC) = (1 + R_FC) × SD(R_FX).
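You can check the risk-free case numerically. The sketch below uses made-up figures (a 5% foreign risk-free rate and a 10% FX return SD are assumptions for illustration, not from the curriculum) and confirms that, because the foreign return is a constant, the domestic-currency SD comes out exactly (1 + Rfc) times the FX SD:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs for illustration:
r_fc = 0.05                                 # risk-free return in the foreign currency (a constant)
sd_fx = 0.10                                # SD of the foreign exchange rate return
r_fx = rng.normal(0.02, sd_fx, 1_000_000)   # simulated FX returns

# Exact domestic-currency return: returns compound multiplicatively.
r_dc = (1 + r_fc) * (1 + r_fx) - 1

# Since r_fc is constant, r_dc = r_fc + (1 + r_fc) * r_fx is an affine
# transform of r_fx, so its SD is scaled by exactly (1 + r_fc).
print(np.std(r_dc))
print((1 + r_fc) * np.std(r_fx))
```

The two printed values agree to floating-point precision, which is the answer to the question above: the (1 + Rfc) factor comes from compounding, not from any risk in Rfc.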

Also note that the curriculum doesn’t address the volatility of returns (in domestic currency) when the investment (in foreign currency) is not risk-free. The reason is that the formula is much too complicated.
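To see why the risky case is messier, a quick simulation (all parameters here are invented for illustration) compares the square-root formula quoted at the top, which adds the returns, with the exact multiplicative domestic-currency return. The two SDs are close but not equal, because the cross-product term R_FC × R_FX adds variance the additive formula ignores:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Hypothetical parameters (illustration only):
sd_fc, sd_fx, rho = 0.15, 0.10, 0.3
cov = [[sd_fc**2, rho * sd_fc * sd_fx],
       [rho * sd_fc * sd_fx, sd_fx**2]]
r_fc, r_fx = rng.multivariate_normal([0.06, 0.02], cov, n).T

# Exact domestic-currency return (multiplicative):
r_dc = (1 + r_fc) * (1 + r_fx) - 1

# Additive approximation from the formula quoted above:
approx = np.sqrt(sd_fc**2 + sd_fx**2 + 2 * rho * sd_fc * sd_fx)
exact = np.std(r_dc)
print(approx, exact)  # close, but not identical
```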