Arun is going to invest $7,700 and leave it in an account for 20 years. Assuming the interest is compounded continuously, what interest rate, to the nearest hundredth of a percent, would be required in order for Arun to end up with $13,100?
Since the interest is compounded continuously, use the formula A = Pe^(rt), where P is the starting amount, A is the ending amount, r is the rate, and t is the time in years. Plug in what we know: 13,100 = 7,700e^(20r). Divide both sides by 7,700 to get e^(20r) = 13,100/7,700 ≈ 1.7013. Take the natural log of both sides: 20r = ln(1.7013) ≈ 0.5314. Divide by 20: r ≈ 0.02657, so the rate needs to be about 2.66%. (Hope this helps)
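If you want to double-check the arithmetic, here's a small Python sketch (the variable names are just my own choice) that solves for r the same way and then verifies it:

```python
import math

P = 7700    # starting amount
A = 13100   # target amount
t = 20      # years

# Continuous compounding: A = P * e^(r*t), so r = ln(A/P) / t
r = math.log(A / P) / t
print(f"rate = {r:.5f} = {r * 100:.2f}%")        # about 0.02657 = 2.66%

# Sanity check: grow P at rate r for t years, should land near 13,100
print(f"check: {P * math.exp(r * t):.2f}")       # ~13100.00
```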