Driving to your friend's house, you travel at an average rate of 35 miles per hour. On your way home, you travel at an average rate of 40 miles per hour. If the round trip took you 45 minutes, how far is it from your house to your friend's house?

2 Answers


Answer:

18

Step-by-step explanation:

took the lesson

User Ivan Kvasnica

Answer:

14 miles

Step-by-step explanation:

The relation that will be used to solve this problem is:


Distance = Velocity × Time

which can be rewritten as:


Time = Distance / Velocity

Assume that:

The distance from your house to your friend's house is d.

The time taken from your house to your friend's house is t₁.

The time taken from your friend's house back to your house is t₂.

1- From your house to your friend's house:

Average rate = 35 miles/hour

Therefore:


t₁ = d/35

2- From your friend's house to your house:

Average rate = 40 miles per hour

Therefore:


t₂ = d/40

3- Round trip:

We know that the round trip took 45 minutes, which is equivalent to 0.75 hours.

This means that:

t₁ + t₂ = 0.75 hours


d/35 + d/40 = 0.75

(40d + 35d)/1400 = 0.75

75d/1400 = 0.75

75d = 1050

d = 14 miles
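The steps above can be double-checked with a short calculation (a minimal sketch, assuming the 35 mph and 40 mph speeds and the 45-minute round trip from the question):

```python
# Verify the round-trip distance.
v_out, v_back = 35, 40   # speeds in miles per hour
total_time = 45 / 60     # 45 minutes expressed in hours (0.75)

# d/35 + d/40 = 0.75  =>  d * (1/35 + 1/40) = 0.75
d = total_time / (1 / v_out + 1 / v_back)
print(round(d, 6))  # 14.0
```

Solving for d algebraically and plugging in the numbers gives the same 14 miles as the derivation above.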

Hope this helps :)

User Dave Olson