Vance recently ran a marathon. It took him 4.8 hours to run the 26.2 miles required to complete a marathon.

Find his average speed in miles per hour. Then convert his speed to feet per second. (Hint: 1 mile = 5,280 feet).
(Round your answers to two decimal places)
miles per hour
feet per second


1 Answer


Answer:

5.46 miles per hour

8.01 feet per second

Explanation:

Distance = 26.2 miles

Time taken to run the distance = 4.8 hours

Average speed = ??

Average speed = distance / time

= 26.2 miles / 4.8 hours

≈ 5.46 miles per hour

Convert 5.46 miles/hr to ft/sec:

1 mile = 5,280 ft, so 5.46 × 5,280 = 28,828.8 ft/hr

1 hour = 3,600 seconds, so 28,828.8 / 3,600 ≈ 8.01 ft/sec
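
For anyone who wants to verify the arithmetic, here is a minimal Python sketch of the same two steps (the variable names are just illustrative):

```python
# Marathon speed: miles per hour, then feet per second
distance_miles = 26.2
time_hours = 4.8

# Step 1: average speed in miles per hour
mph = distance_miles / time_hours

# Step 2: convert using 5,280 ft per mile and 3,600 s per hour
fps = mph * 5280 / 3600

print(round(mph, 2))  # 5.46
print(round(fps, 2))  # 8.01
```

Converting from the unrounded 5.4583... mph gives the same 8.01 ft/sec once rounded to two decimal places.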
