A runner completes a 26-mile marathon in 3 hours and 40 minutes. Find the average speed in meters per second.

1 Answer

To find the average speed in meters per second, you'll need to convert the marathon distance from miles to meters and the time from hours and minutes to seconds.

1 mile = 1,609.34 meters (approximately)
1 hour = 3,600 seconds
1 minute = 60 seconds

So, the marathon distance in meters is:
26 miles * 1,609.34 meters/mile = 41,842.84 meters

The time in seconds is:
3 hours * 3,600 seconds/hour + 40 minutes * 60 seconds/minute = 13,200 seconds

Now, you can calculate the average speed:

Average Speed (meters per second) = Distance (meters) / Time (seconds)
Average Speed = 41,842.84 meters / 13,200 seconds ≈ 3.17 meters per second

So the runner's average speed is approximately 3.17 m/s.
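
If you'd like to check the arithmetic yourself, here is a minimal Python sketch of the same calculation (the constant and variable names are illustrative, not part of the original answer):

```python
# Approximate conversion factor: meters per mile
MILE_TO_METERS = 1609.34

# Distance: 26 miles converted to meters
distance_m = 26 * MILE_TO_METERS          # 41,842.84 m

# Time: 3 hours 40 minutes converted to seconds
time_s = 3 * 3600 + 40 * 60               # 13,200 s

# Average speed = distance / time
avg_speed = distance_m / time_s
print(f"Average speed: {avg_speed:.2f} m/s")  # Average speed: 3.17 m/s
```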
