Differences in international interest rates are a relevant and interesting issue, since they drive a large part of capital flows.
Why would we expect the difference between the 1-year interest rate on the dollar and the 1-year interest rate on the Euro (or any other freely convertible currency) to match exactly the anticipated depreciation/appreciation of the dollar against that currency over the same year? In other words, why are real interest rates in all major currencies equal, once adjusted for the anticipated depreciation/appreciation of one currency relative to another?
This happens because otherwise there would be a possibility of making a riskless profit through arbitrage.
Let us take an example. The 1-year interest rate in the US is 3% and in Europe it is 4%, a differential of 1%. The spot rate between the US Dollar and the Euro is 1.23, i.e., 1 Euro is worth 1.23 USD. Suppose you borrow $100 in the US for 1 year at the prevailing rate of 3%, so at the end of the year you must repay USD 103. You convert the $100 to Euros now, getting 100/1.23 ≈ €81.30, and invest it in Europe at 4%, which grows to about €84.55 after one year. For there to be no riskless profit, the 1-year forward rate must be the one at which €84.55 converts back to exactly USD 103, i.e., about 1.2182 USD per Euro. The forward rate therefore adjusts so that the interest differential is exactly offset by the expected depreciation of the higher-yielding currency.
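The arithmetic in the example above can be checked with a short sketch (the rates and spot quote are the illustrative numbers from the example, not market data):

```python
# Covered interest parity sketch using the example's illustrative numbers.
i_usd = 0.03          # 1-year USD interest rate
i_eur = 0.04          # 1-year EUR interest rate
spot = 1.23           # spot rate, USD per EUR

# Borrow $100 at 3%: repay $103 in one year.
usd_borrowed = 100.0
usd_owed = usd_borrowed * (1 + i_usd)

# Convert to Euros at the spot rate and invest at 4%.
eur_now = usd_borrowed / spot                # ~81.30
eur_in_1y = eur_now * (1 + i_eur)            # ~84.55

# The forward rate at which the round trip exactly breaks even:
forward_breakeven = usd_owed / eur_in_1y
# The same number from the parity formula F = S * (1 + i_usd) / (1 + i_eur):
forward_parity = spot * (1 + i_usd) / (1 + i_eur)

print(round(forward_breakeven, 4))   # ~1.2182 USD per EUR
```

If the actual forward rate quoted in the market differed from this break-even value, the strategy above (or its reverse) would lock in a riskless profit, and arbitrageurs would trade until the gap closed.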
The solution explains how forward exchange rates are related to the interest rate differential.