If a marathon runner averages 8.6 mi/h, how long does it take him or her to run a 26.22-mi marathon?

1 Answer


Answer:

Time, t ≈ 3.05 hours

Step-by-step explanation:

Given that,

Average speed of the marathon runner, v = 8.6 mi/h

Distance covered by the marathon runner, d = 26.22 mi

We need to find the time taken by the marathon runner to cover that distance. The time taken is given by:


t = \dfrac{d}{v}

t = \dfrac{26.22\ \text{mi}}{8.6\ \text{mi/h}}

t ≈ 3.05 hours

So, the time taken by the marathon runner is about 3.05 hours (roughly 3 hours and 3 minutes). Hence, this is the required solution.
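If you want to double-check the arithmetic, here is a minimal Python sketch (the variable names are just illustrative, not part of the original answer):

```python
# Quick check of the arithmetic: time = distance / speed
distance_mi = 26.22   # marathon distance in miles
speed_mph = 8.6       # average speed in miles per hour

time_h = distance_mi / speed_mph          # time in hours
hours = int(time_h)                       # whole hours
minutes = (time_h - hours) * 60           # leftover fraction of an hour, in minutes

print(f"t = {time_h:.2f} h (about {hours} h {minutes:.0f} min)")
# Output: t = 3.05 h (about 3 h 3 min)
```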
