A person invests 10,000 dollars in a bank. The bank pays 6.5% interest compounded annually. To the nearest tenth of a year, how long must the person leave the money in the bank until it reaches 29,500 dollars?

1 Answer


Answer:

t ≈ 17.2 years

Explanation:

We can use the formula for compound interest:

A = P(1 + r/n)^(nt)

where A is the final amount, P is the principal (initial amount), r is the annual interest rate (as a decimal), n is the number of times the interest is compounded per year, and t is the time (in years).
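As a quick sanity check, here is a minimal Python sketch of this formula (the function name compound_amount is just illustrative):

```python
def compound_amount(P, r, n, t):
    """Final amount A = P(1 + r/n)^(nt) for principal P, annual rate r
    (as a decimal), n compounding periods per year, over t years."""
    return P * (1 + r / n) ** (n * t)

# $10,000 at 6.5% compounded annually for 17.2 years
print(compound_amount(10_000, 0.065, 1, 17.2))  # ≈ 29540
```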

In this case, we know that P = $10,000, r = 6.5% = 0.065, and we want to find t when A = $29,500. We also know that the interest is compounded annually, so n = 1.

Substituting these values into the formula, we get:

$29,500 = $10,000(1 + 0.065/1)^(1·t)

Dividing both sides by $10,000, we get:

2.95 = (1.065)^t

Taking the natural logarithm of both sides, we get:

ln(2.95) = ln((1.065)^t)

Using the property of logarithms that ln(a^b) = b ln(a), we can rewrite the right side as:

ln(2.95) = t ln(1.065)

Dividing both sides by ln(1.065), we get:

t = ln(2.95)/ln(1.065) ≈ 17.2

Therefore, to the nearest tenth of a year, the person must leave the money in the bank for about 17.2 years for it to reach $29,500.
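For completeness, here is a short Python sketch of this last step, solving for t with math.log and verifying the result (variable names are just illustrative):

```python
import math

P, A, r, n = 10_000, 29_500, 0.065, 1

# t = ln(A/P) / (n * ln(1 + r/n)); with n = 1 this is ln(2.95)/ln(1.065)
t = math.log(A / P) / (n * math.log(1 + r / n))
print(round(t, 1))  # 17.2

# Verify: the balance after 17.2 years is roughly $29,500
print(P * (1 + r / n) ** (n * 17.2))  # ≈ 29540
```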
