Two hours at 45 mph covers a distance of 90 miles, so Sam has another 90 miles to travel.
Total distance = average rate * total time, and his average speed over the whole trip must be 60 mph, so
180 = 60 * (2 + t), where t is the time he has left to travel the other 90 miles.
180 = 120 + 60t
180 - 120 = 60t
60 = 60t
t = 60/60 = 1 hour
Sam must travel 90 miles in 1 hour, so he must drive at 90 miles per hour on the second leg to average 60 mph overall. 90 mph is the answer.
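If it helps, here is a quick Python sketch of the same arithmetic (the variable names are mine, not from the problem):

    leg1_distance = 90          # miles, first leg
    leg1_speed = 45             # mph
    leg1_time = leg1_distance / leg1_speed      # 2 hours

    total_distance = 180        # miles
    target_average = 60         # mph
    total_time = total_distance / target_average  # 3 hours needed in total

    leg2_time = total_time - leg1_time          # 1 hour remaining
    leg2_speed = 90 / leg2_time                 # speed needed for the last 90 miles
    print(leg2_speed)                           # 90.0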
You can check this with the harmonic mean of the two speeds, which gives the average speed when equal distances are covered at each speed:
(2 * 45 * 90) / (45 + 90) = 8100 / 135 = 60 mph
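A quick numeric check of that harmonic mean (a sketch in Python, names are my own):

    a, b = 45, 90
    harmonic_mean = 2 * a * b / (a + b)   # average speed over two equal-distance legs
    print(harmonic_mean)                  # 60.0, matching the target average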
You can also solve for the unknown speed R directly using the harmonic mean:
(2 * 45 * R) / (45 + R) = 60
2 * 45 * R = 60 * (45 + R)
90R = 2700 + 60R
90R - 60R = 2700
30R = 2700
R = 2700/30
R = 90 mph must be Sam's speed for the other 90 miles to average 60 mph.
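And the same equation can be solved symbolically, for example with sympy (my choice of tool, not part of the original problem):

    from sympy import symbols, solve, Eq

    R = symbols('R', positive=True)
    # average of two equal-distance legs at 45 mph and R mph must equal 60 mph
    equation = Eq(2 * 45 * R / (45 + R), 60)
    print(solve(equation, R))   # [90]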