4 votes
Roger ran 3.2 miles the first day of the fundraiser, 4.0 miles the second day, and 5.1 miles the last day. If he earned $0.15 per foot for charity, how much did he earn?

asked by AndreasW (8.9k points)

1 Answer

4 votes
3.2 + 4.0 + 5.1 = 12.3 miles total. Since the rate is $0.15 per foot, first convert the distance to feet (1 mile = 5,280 feet): 12.3 × 5,280 = 64,944 feet. Then multiply by the rate, because earnings = distance × rate per unit distance: 64,944 × $0.15 = $9,741.60. The answer is $9,741.60.
answered by ChenZ (8.5k points)
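
A quick sketch in Python to check the arithmetic; the 5,280 feet-per-mile conversion factor is the standard one assumed here:

```python
# Check the charity earnings: sum the daily distances,
# convert miles to feet, then multiply by the per-foot rate.
FEET_PER_MILE = 5280
RATE_PER_FOOT = 0.15  # dollars earned per foot run

daily_miles = [3.2, 4.0, 5.1]
total_miles = sum(daily_miles)             # 12.3 miles
total_feet = total_miles * FEET_PER_MILE   # 64,944 feet
earnings = total_feet * RATE_PER_FOOT      # $9,741.60

print(f"{total_miles:.1f} mi = {total_feet:,.0f} ft -> ${earnings:,.2f}")
```

Running it prints `12.3 mi = 64,944 ft -> $9,741.60`, matching the hand calculation above.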


