In 2018, the winner of the Chicago Marathon ran 26.2 miles in 2 hours, 5 minutes, and 11 seconds. What was his average speed in miles per hour?

asked by Morgs

1 Answer


Answer:

The winner ran at an average speed of about 12.56 miles per hour.

Explanation:

In order to find this, we first need to convert the minutes and seconds into hours so that we can add them to the 2 hours and then solve. We do this by dividing minutes by 60 and seconds by 3600, since there are 60 minutes and 3600 seconds in an hour.

5 mins / 60 = 0.083 hours

11 secs / 3600 = 0.003 hours

2 + 0.083 + 0.003 = 2.086 hours

Now we take our 2.086 hours and divide the miles by that number to get the miles per hour.

26.2 miles / 2.086 hours ≈ 12.56 mph
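The conversion and division above can be checked with a short Python snippet (a minimal sketch; the variable names are just illustrative):

```python
# Convert the winning time (2 h 5 min 11 s) to hours, then compute mph.
hours, minutes, seconds = 2, 5, 11
distance_miles = 26.2  # marathon distance

total_hours = hours + minutes / 60 + seconds / 3600
speed_mph = distance_miles / total_hours

print(round(total_hours, 4))  # 2.0864
print(round(speed_mph, 2))    # 12.56
```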

answered by DJJ
