A person invests 5500 dollars in a bank. The bank pays 4.5% interest compounded annually. To the nearest tenth of a year, how long must the person leave the money in the bank until it reaches 6700 dollars?


1 Answer


Answer: approximately 4.5 years

Explanation:

We can use the formula for compound interest:

A = P(1 + r/n)^(nt)

where A is the final amount, P is the principal (initial amount), r is the annual interest rate (as a decimal), n is the number of times the interest is compounded per year, and t is the time (in years).
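As a quick illustration of the formula itself, here is a minimal Python sketch (the function name compound_amount is just for this example, not a standard library function):

def compound_amount(P, r, n, t):
    # Final amount A = P(1 + r/n)^(n*t)
    return P * (1 + r / n) ** (n * t)

# Example: 5500 dollars at 4.5% compounded annually (n = 1) for 1 year
print(compound_amount(5500, 0.045, 1, 1))  # 5747.5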

In this case, we know that P = $5500, r = 4.5% = 0.045, and we want to find t when A = $6700. We also know that the interest is compounded annually, so n = 1.

Substituting these values into the formula, we get:

$6700 = $5500(1 + 0.045/1)^(1·t) = $5500(1.045)^t

Dividing both sides by $5500, we get:

1.218181818 = (1.045)^t

Taking the natural logarithm of both sides, we get:

ln(1.218181818) = ln(1.045^t)

Using the property of logarithms that ln(a^b) = b ln(a), we can rewrite the right side as:

ln(1.218181818) = t ln(1.045)

Dividing both sides by ln(1.045), we get:

t = ln(1.218181818)/ln(1.045) ≈ 0.19736/0.04402 ≈ 4.5

Therefore, the person must leave the money in the bank for approximately 4.5 years (to the nearest tenth of a year) for the balance to reach $6700.
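For anyone who wants to check this numerically, here is a short Python sketch of the same logarithm calculation (a verification of the steps above, using the standard math module):

import math

P, A, r = 5500, 6700, 0.045
t = math.log(A / P) / math.log(1 + r)  # ln(1.218...) / ln(1.045)
print(round(t, 1))  # 4.5

As a sanity check, 5500 * 1.045**4.5 ≈ 6704.8, which confirms the balance reaches $6700 at about 4.5 years.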
