It took a race car driver 3 hours and 15 minutes to go 500 miles. What was his mean rate of speed, to the nearest tenth of a mile per hour?

asked by User Nektobit (7.8k points)

2 Answers

3 × 60 = 180 minutes (because there are 60 minutes in an hour), and 180 + 15 = 195 minutes. Then 500 / 195 ≈ 2.564 miles per minute. Since the question asks for miles per hour, multiply by 60: 2.564 × 60 ≈ 153.8. He goes about 153.8 miles per hour.
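
If you want to sanity-check the arithmetic, here is a minimal Python sketch of the same minutes-based calculation (the variable names are just illustrative):

```python
# Distance and time taken from the problem statement
distance_miles = 500
time_minutes = 3 * 60 + 15                          # 3 hours 15 minutes = 195 minutes

miles_per_minute = distance_miles / time_minutes    # about 2.564 miles per minute
miles_per_hour = miles_per_minute * 60              # convert to miles per hour

print(round(miles_per_hour, 1))                     # prints 153.8
```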
answered by User DNKROZ (8.1k points)

Answer:

The racer's mean rate of speed is 153.8 mph.

Explanation:

It took a race car driver 3 hours and 15 minutes to go 500 miles.

Now converting 3 hours and 15 minutes to hours.

15 minutes = 15/60 = 0.25 hours

So it took the race car driver 3.25 hours to go 500 miles.

So, the average speed is 500/3.25 ≈ 153.8 mph.

Hence, the racer's mean rate of speed is 153.8 mph, to the nearest tenth.
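
As a quick check, the same hours-based calculation in Python (just a sketch of the arithmetic above):

```python
# Convert 3 hours 15 minutes to hours, then divide distance by time
hours = 3 + 15 / 60          # 3.25 hours
speed_mph = 500 / hours      # 153.846... mph

print(round(speed_mph, 1))   # 153.8, to the nearest tenth
```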

answered by User Stephen Walcher (8.1k points)