Final answer:
The average speed of the runner who traveled 7.7 miles in an hour and a half is found by dividing total distance by total time, which gives approximately 5.13 mph. The closest answer choice is A) 5.1 mph.
Step-by-step explanation:
The student asked: A runner traveled 7.7 miles in an hour and a half. What was their average speed? To find the average speed, you need to divide the total distance traveled by the total time taken. In this case:
- Total distance = 7.7 miles
- Total time = 1.5 hours (1 hour and 30 minutes)
Therefore, the runner's average speed would be:
Average speed = Total distance ÷ Total time
Average speed = 7.7 miles ÷ 1.5 hours ≈ 5.13 mph
So the correct answer is A) 5.1 mph, which is closest to our calculation.
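To double-check the arithmetic, here is a small sketch in Python (the function name `average_speed_mph` is just an illustration, not from the original problem):

```python
def average_speed_mph(distance_miles: float, time_hours: float) -> float:
    """Average speed = total distance divided by total time."""
    return distance_miles / time_hours

# 7.7 miles in 1.5 hours (an hour and a half)
speed = average_speed_mph(7.7, 1.5)
print(round(speed, 2))  # 5.13, which rounds to answer choice A) 5.1 mph
```

Rounding to one decimal place confirms that 5.1 mph is the closest listed option.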