Sales of a certain product are declining at a rate proportional to the amount of sales. If at the end of the first year the sales have declined by 22%, then how many years will have passed (since the beginning of the first year) when sales become only 31% of their original value? Express your answer as a decimal, correct to within 0.001 years.

asked by GMAC

1 Answer


Answer:

The answer is approximately 4.714 years.

Explanation:

Sales decline by 22% during the first year, so after one year they stand at 78% of their starting value.

Take the starting sales to be 100 units.
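Because the decline is proportional to the sales themselves, sales follow continuous exponential decay. A short sketch (writing S(t) for sales after t years and k for the decay constant, both introduced here only for illustration) shows why the compound-decline formula used below applies:

\frac{dS}{dt} = -kS \;\Rightarrow\; S(t) = 100\,e^{-kt}

S(1) = 100 - 22 = 78 \;\Rightarrow\; e^{-k} = 0.78

S(t) = 100\,(e^{-k})^t = 100\,(0.78)^t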

Using the compound-decline formula, set sales equal to 31% of their original value:


\to 100 \left(1-\frac{22}{100}\right)^t = 31\% \text{ of } 100

\to 100 \left(\frac{100-22}{100}\right)^t = \frac{31}{100} \times 100

\to \left(\frac{78}{100}\right)^t = \frac{31}{100}

\to 0.78^t = 0.31

Taking the natural log of both sides:


\to \log_e 0.78^t = \log_e 0.31

\to t \log_e 0.78 = \log_e 0.31

\to t = \frac{\log_e 0.31}{\log_e 0.78}


= \frac{-1.17118}{-0.24846}

= \frac{1.17118}{0.24846}

\approx 4.714
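As a quick sanity check, back-substituting the rounded value 4.714 (an approximation used here only for verification):

0.78^{4.714} = e^{4.714\,\log_e 0.78} \approx e^{-1.1712} \approx 0.310

so after about 4.714 years sales are roughly 31% of their original value, as required.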

answered by Kevin Chen
