A person invests $10,000 in a bank that pays 4.75% interest compounded semi-annually. To the nearest tenth of a year, how long must the person leave the money in the bank for it to reach $19,200?


1 Answer


Answer:

t = 13.9 years, to the nearest tenth of a year.

Explanation:

In this question, we are asked to find how many years it takes for an invested amount to grow to a target amount under compound interest.

To find the number of years, we use the compound interest formula.

Mathematically,

A = P(1 + r/n)^(nt)

where A is the final amount after compounding, which is $19,200 in this question;

P is the initial amount invested, which is $10,000;

r is the annual interest rate, 4.75% = 4.75/100 = 0.0475;

n is the number of times per year interest is compounded, which is 2 since interest is compounded semi-annually;

t is the number of years, which is what we want to find.
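Before plugging in numbers, the formula can be expressed as a small Python helper (a minimal sketch; the function name compound_amount is illustrative, not from the question):

```python
def compound_amount(P: float, r: float, n: int, t: float) -> float:
    """Amount after t years: principal P at annual rate r, compounded n times per year."""
    return P * (1 + r / n) ** (n * t)

# For example, with this question's values after 10 years:
# compound_amount(10_000, 0.0475, 2, 10) is about 15,991 -- still short of 19,200.
```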

We plug in these values:

19,200 = 10,000(1 + 0.0475/2)^(2t)

Divide both sides by 10,000:

1.92 = (1 + 0.02375)^(2t)

1.92 = (1.02375)^(2t)

Take the logarithm of both sides:

log 1.92 = log[(1.02375)^(2t)]

log 1.92 = 2t log 1.02375

2t = log 1.92/log 1.02375

2t = 27.79

t = 27.79/2

t = 13.895 years

The question asks for the answer to the nearest tenth of a year, so t = 13.9 years.
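As a quick numerical check, here is the same log computation in Python (a minimal sketch using the values above; not part of the original answer):

```python
import math

P, A = 10_000, 19_200   # initial amount and target amount
r, n = 0.0475, 2        # annual rate, compoundings per year

# Rearranging A = P(1 + r/n)^(nt) gives t = log(A/P) / (n * log(1 + r/n)).
t = math.log(A / P) / (n * math.log(1 + r / n))
print(round(t, 1))  # 13.9
```

Any logarithm base works here, since the base cancels in the ratio; math.log (natural log) is used above.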
