A person invests 9500 dollars in a bank. The bank pays 5.75% interest compounded daily. To the nearest tenth of a year, how long must the person leave the money in the bank until it reaches 26100 dollars? Use A = P(1 + r/n)^(nt).

by Mike Kaply (7.2k points)

1 Answer


1. Data input

P = 9500 dollars

r = 5.75% per year, compounded daily

A = 26100 dollars

n = 365 (compounding periods per year)

2. Equation


A=P\left(1+\frac{r}{n}\right)^{nt}

\begin{gathered} 26100=9500\left(1+\frac{0.0575}{365}\right)^{365t} \\ \frac{26100}{9500}=(1+0.000157534)^{365t} \\ \ln\left(\frac{26100}{9500}\right)=365t\,\ln(1.000157534) \\ t=\frac{\ln(26100/9500)}{365\,\ln(1.000157534)}\approx 17.578 \end{gathered}

To the nearest tenth:

t=17.6\text{ years}
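The calculation above can be checked with a short Python sketch that solves A = P(1 + r/n)^(nt) for t by taking logarithms, using the values given in the problem:

```python
import math

# Given values from the problem
P = 9500        # principal, in dollars
A = 26100       # target amount, in dollars
r = 0.0575      # annual interest rate (5.75%)
n = 365         # compounding periods per year (daily)

# Solve A = P * (1 + r/n)**(n*t) for t:
#   t = ln(A/P) / (n * ln(1 + r/n))
t = math.log(A / P) / (n * math.log(1 + r / n))

print(round(t, 1))  # 17.6
```

This agrees with the answer of 17.6 years to the nearest tenth.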

by Jadengeller (7.8k points)