A runner traveled 7.7 miles in an hour and a half. What was their average speed?

A) 5.1 mph
B) 10.3 mph
C) 7.7 mph
D) 11.3 mph

asked by HTU (8.0k points)

1 Answer


Final answer:

The average speed of the runner who traveled 7.7 miles in an hour and a half is found by dividing total distance by total time: 7.7 miles ÷ 1.5 hours ≈ 5.13 mph. The closest answer is A) 5.1 mph.

Step-by-step explanation:

The student asked: A runner traveled 7.7 miles in an hour and a half. What was their average speed? To find the average speed, you need to divide the total distance traveled by the total time taken. In this case:

  • Total distance = 7.7 miles
  • Total time = 1.5 hours (1 hour and 30 minutes)

Therefore, the runner's average speed would be:

Average speed = Total distance ÷ Total time

Average speed = 7.7 miles ÷ 1.5 hours ≈ 5.13 mph
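If you want to check the arithmetic, here is a minimal sketch in Python (variable names are illustrative):

```python
# Average speed = total distance / total time
distance_miles = 7.7   # total distance traveled
time_hours = 1.5       # 1 hour and 30 minutes

average_speed = distance_miles / time_hours
print(f"Average speed: {average_speed:.2f} mph")  # → Average speed: 5.13 mph
```

Rounding 5.1333… to one decimal place gives 5.1 mph, matching choice A.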

So the correct answer is A) 5.1 mph, the option closest to the calculated value.

answered by DiscoInfiltrator (7.8k points)