PLEASE HELP

will be giving brainliest

Terrell drove to the mountains at an average of 40 miles per hour. His return trip, by the same roads, averaged 50 miles per hour. His total driving time was 9 hours. How far did he drive one way?

1 Answer


Answer:

Terrell drove 200 miles one way

Explanation:

Since his total driving time was 9 hours, let x = the time (in hours) traveled at 40 mph

and 9 - x = time traveled at 50 mph

D = (rate) × (time)

Since the distance each way is the same, set the two distances equal:

40x = 50(9 - x) -----Use the distributive property on the right side of the equation

40x = 450 - 50x -----Add 50x on both sides

90x = 450 -----Divide by 90 on both sides

We end up with x = 5, the time in hours for the outbound leg, which we can use to find the one-way distance:

D = 40x, or 40(5) = 200 mi
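As a quick sanity check (a minimal sketch in Python, using only the figures given in the problem), the outbound and return times recomputed from a 200-mile one-way distance should add up to the 9 hours of total driving time:

distance = 200               # one-way distance in miles, from the answer above
time_out = distance / 40     # hours spent driving out at 40 mph
time_back = distance / 50    # hours spent driving back at 50 mph
print(time_out, time_back, time_out + time_back)   # 5.0 4.0 9.0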

by Jose Gonzalez (8.4k points)