A truck drove at an average rate of 45 miles per hour on the highway and 20 miles per hour once it entered the city. If the 100-mile trip took 3 hours, how much of it was in the city?

This is for 10 points

by Omar Fayad (6.1k points)

2 Answers

6 votes

Answer:

28 miles

Explanation:

The solution below shows the truck spent 1.4 hours in the city, so the city portion of the trip was 1.4 h × 20 mph = 28 miles. As a check, the highway portion was 1.6 h × 45 mph = 72 miles, and 72 + 28 = 100 miles, the full trip length.
by IDroid (6.4k points)
4 votes

Answer:

1.4 hours of the trip was spent in the city.

Explanation:

Let h be the number of hours the truck drove on the highway, and c be the number of hours it drove in the city.

Since the total distance the truck traveled was 100 miles, and distance = rate × time, we have


45h+20c=100

This says the distance the truck traveled on the highway (45h miles) plus the distance it traveled in the city (20c miles) is 100 miles.

And since the truck traveled for a total of 3 hours, we have:


h+c=3

Thus we have two equations and two unknowns:

(1).
45h+20c=100

(2).
h+c=3

We solve this system by first solving for h in equation (2) and substituting it into equation (1):


h=3-c


45(3-c)+20c=100


135-45c+20c=100


135-25c=100


25c=35


\boxed{c=1.4}

So the time in the city was 1.4 hours.

And from equation (2), the time on the highway was:


h+1.4=3


\boxed{h=1.6}
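
If you want to double-check the arithmetic, here is a minimal sketch in Python that solves the same two equations. It assumes the sympy library is available, which is not part of the original answer.

from sympy import symbols, Eq, solve

# h = hours on the highway, c = hours in the city
h, c = symbols("h c", positive=True)

# (1) distance: 45 mph on the highway plus 20 mph in the city totals 100 miles
# (2) time: the two legs together take 3 hours
solution = solve([Eq(45*h + 20*c, 100), Eq(h + c, 3)], [h, c])

print(solution)          # {c: 7/5, h: 8/5}, i.e. 1.4 hours in the city and 1.6 on the highway
print(20 * solution[c])  # 28 -- the distance (in miles) driven in the city

The last line gives the distance reading of the question, matching the other answer's 28 miles.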

by Las Ten (6.3k points)