Final answer:
The runner's average speed is found by converting the marathon distance to meters and the running time to seconds, then dividing distance by time; the result is approximately 3.95 m/s.
Step-by-step explanation:
To calculate the average speed of the marathon runner in meters per second (m/s), we first convert the total running time from hours and minutes to seconds. The runner completed the marathon in 2 hours and 57 minutes, which is (2 × 60 + 57) = 177 minutes. Multiplying by 60 converts this to seconds: 177 × 60 = 10620 seconds.
Next, we convert the marathon distance from kilometers to meters. Since 1 km is equivalent to 1000 m, the distance in meters is 42.0 km × 1000 m/km = 42000 m.
The average speed is then calculated by dividing the total distance by the total time: Average speed = Total distance / Total time = 42000 m / 10620 s ≈ 3.95 m/s.
Therefore, the correct answer to the student's question is an average speed of approximately 3.95 m/s.
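As a quick sanity check, here is a minimal Python sketch of the same arithmetic (the variable names are illustrative, not part of the original problem):

```python
# Convert the running time (2 h 57 min) to seconds
hours, minutes = 2, 57
total_time_s = (hours * 60 + minutes) * 60   # 177 min -> 10620 s

# Convert the marathon distance from kilometers to meters
distance_m = 42.0 * 1000                     # 42000 m

# Average speed = total distance / total time
average_speed = distance_m / total_time_s
print(f"Average speed: {average_speed:.2f} m/s")  # -> 3.95 m/s
```

Running this prints 3.95 m/s, matching the hand calculation above.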