2 votes
A person jogs 4.0 miles in 31 minutes, then stops for 1 minute to tie her shoe, and then finishes the last 2.0 miles in 22 minutes. What is the jogger's average speed in miles per minute?

asked by Lukaspp (7.8k points)

1 Answer

6 votes

Final answer:

The jogger's average speed is 0.1111 miles per minute, calculated by dividing the total distance of 6.0 miles by the total time of 54 minutes.

Step-by-step explanation:

To calculate the jogger's average speed in miles per minute, we need the total distance traveled and the total elapsed time, including the break.

The jogger covers 4.0 miles in 31 minutes and then another 2.0 miles in 22 minutes, so the total distance is 4.0 miles + 2.0 miles = 6.0 miles. The jogger also takes a 1-minute break, so the total elapsed time is 31 minutes + 1 minute + 22 minutes = 54 minutes.

To find the average speed, we divide the total distance by the total time:

6.0 miles / 54 minutes = 0.1111 miles per minute (rounded to four decimal places).
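
Putting the whole calculation into a single expression (average speed = total distance divided by total time):

$$\bar{v} = \frac{4.0\ \text{mi} + 2.0\ \text{mi}}{(31 + 1 + 22)\ \text{min}} = \frac{6.0\ \text{mi}}{54\ \text{min}} \approx 0.1111\ \text{mi/min}$$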

This is the jogger's average speed over the entire distance.
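
If you want to double-check the arithmetic, here is a minimal sketch in Python (the variable names are just illustrative, not from the original problem):

```python
# Quick check of the average-speed calculation from the answer above.
leg1_miles, leg1_minutes = 4.0, 31.0   # first leg: 4.0 miles in 31 minutes
break_minutes = 1.0                    # shoe-tying stop
leg2_miles, leg2_minutes = 2.0, 22.0   # second leg: 2.0 miles in 22 minutes

total_miles = leg1_miles + leg2_miles                        # 6.0 miles
total_minutes = leg1_minutes + break_minutes + leg2_minutes  # 54 minutes

average_speed = total_miles / total_minutes                  # miles per minute
print(f"{average_speed:.4f} miles per minute")               # prints 0.1111
```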

answered by Jonathan Benn (8.3k points)