A typical marathon is approximately 42.3 kilometers. Marty averages 8 miles per hour when running in marathons. Determine how long it would take Marty to complete a marathon, to the nearest tenth of an hour. Justify your answer.


1 Answer

Final answer:

To calculate Marty's marathon time, we convert the marathon distance to miles and then use the formula time = distance/speed, resulting in an approximate completion time of 3.3 hours.

Step-by-step explanation:

To determine how long it would take Marty to complete a marathon, we first need to convert the marathon distance from kilometers to miles, since Marty's speed is given in miles per hour. We know that 1 kilometer is approximately equal to 0.621371 miles. Therefore, we can calculate the distance of a marathon in miles:

42.3 kilometers × 0.621371 miles/kilometer ≈ 26.28 miles (rounded to two decimal places).

Next, we use the formula time = distance/speed to calculate the time it would take Marty to run the marathon:

Time = 26.28 miles ÷ 8 miles/hour ≈ 3.2855 hours.

Finally, we round this to the nearest tenth of an hour:

Time ≈ 3.3 hours.

So, it would take Marty approximately 3.3 hours to complete a marathon.
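As a check, the calculation above can be reproduced in a few lines of Python (the variable names are just for illustration; the values come from the problem statement):

```python
# Convert the marathon distance to miles, then apply time = distance / speed.
KM_TO_MILES = 0.621371   # approximate miles per kilometer

distance_km = 42.3       # marathon distance given in the problem
speed_mph = 8            # Marty's average speed in miles per hour

distance_miles = distance_km * KM_TO_MILES
time_hours = distance_miles / speed_mph

print(round(distance_miles, 2))  # 26.28 miles
print(round(time_hours, 1))      # 3.3 hours
```

Rounding only at the end (rather than using the rounded 26.28 in the division) avoids compounding rounding error, though here both approaches give 3.3 hours to the nearest tenth.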
