Sam drove for 2 hours at 45 mph. He has the same distance remaining. How fast should he drive to make his average speed 60 mph?

asked by Arne S

2 Answers

2 hours at 45 mph gives a distance of 90 miles.
He has another 90 miles to travel.
total distance = rate × (total time)
His average speed must be 60 mph:
180 = 60 × (2 + t), where t is the time he has to travel the other 90 miles
180 = 120 + 60t
180 - 120 = 60t
60 = 60t
t = 60/60
t = 1 hour
Sam must travel 90 miles in 1 hour, so he must drive at 90 miles per hour to bring his average up to 60 mph.
90 mph is the answer.
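If you'd like to double-check the arithmetic, here is a small Python sketch of the same steps (the variable names are just my own labels):

```python
# First leg: 2 hours at 45 mph
leg_distance = 2 * 45               # 90 miles each way
total_distance = 2 * leg_distance   # 180 miles, since the same distance remains

# To average 60 mph overall, the whole trip must take:
total_time = total_distance / 60    # 3 hours

# Time left for the second 90 miles:
remaining_time = total_time - 2     # 1 hour

# Required speed for the second leg:
required_speed = leg_distance / remaining_time
print(required_speed)               # 90.0 mph
```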

You can check this with the harmonic mean of the two speeds (it applies here because both legs cover the same distance):
(2 × 45 × 90)/(45 + 90) = 8100/135 = 60 mph
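The same check in one line of Python, just plugging the numbers in:

```python
# Harmonic mean of the two leg speeds (valid because both legs are 90 miles)
print((2 * 45 * 90) / (45 + 90))  # 60.0 mph average
```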

You can also find the answer directly from the harmonic mean:
(2 × 45 × R)/(45 + R) = 60
2 × 45 × R = 60 × (45 + R)
90R = 2700 + 60R
90R - 60R = 2700
30R = 2700
R = 2700/30
R = 90 mph must be Sam's speed for the other 90 miles to average 60 mph.
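If you'd rather let a computer do that algebra, here is a short sketch that solves the same equation for R using sympy (assuming it's available):

```python
from sympy import symbols, Eq, solve

R = symbols('R', positive=True)
# Harmonic mean of the two leg speeds must equal 60 mph:
# 2*45*R / (45 + R) = 60
solution = solve(Eq(2 * 45 * R / (45 + R), 60), R)
print(solution)  # [90]
```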
answered by Sir Celsius

Answer:

90 mph

Explanation:

One-way distance:

2 × 45 = 90 miles

Total distance:

2 × 90 = 180 miles

Speed = distance/time

60 = 180/time

time = 3 hours

He has already driven for 2 hours, so 1 hour remains.

He needs to cover the other 90 miles in 1 hour:

speed = 90 miles/hour
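
The same reasoning as a small reusable Python function (the name required_second_leg_speed is just for illustration). The distance cancels out, so only the first-leg speed and the target average are needed:

```python
def required_second_leg_speed(first_speed, target_average):
    """Speed needed on the second half of an equal-distance trip
    so the overall average equals target_average (speeds in mph)."""
    # From 2*d / (d/first_speed + d/second_speed) = target_average,
    # the distance d cancels and the second speed works out to:
    return first_speed * target_average / (2 * first_speed - target_average)

print(required_second_leg_speed(45, 60))  # 90.0 mph
```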

answered by Jamiebarrow