During a 40-mile trip, Marla traveled at an average speed of x miles per hour for the first y miles of the trip and at an average speed of 1.25x miles per hour for the last 40 – y miles of the trip. The time that Marla took to travel the 40 miles was what percent of the time it would have taken her if she had traveled at an average speed of x miles per hour for the entire trip?

1 Answer

Answer:


(32 + 0.2y)/40, i.e. (80 + 0.5y) percent

Step-by-step explanation:

Total distance = 40 miles.

The first y miles are traveled at x miles/hr.

The last 40 − y miles are traveled at 1.25x miles/hr.

We know that

Distance = speed × time

For the first y miles:

Let t be the time taken to cover y miles.

y = x·t

t = y/x -----------(1)

For the last 40 − y miles:

Let t' be the time taken to cover 40 − y miles.

40 − y = 1.25x·t'

t' = (40 − y)/(1.25x) -----------(2)

Adding equations (1) and (2), and using 1/1.25 = 0.8:

t + t' = y/x + (40 − y)/(1.25x)

t + t' = y/x + 0.8(40 − y)/x

t + t' = (32 + 0.2y)/x

Now let T be the time Marla would take if she traveled at x miles per hr for the entire trip.

40 = x T

T=40/x

So, dividing the two times (the x's cancel):

(t + t')/T = ((32 + 0.2y)/x) / (40/x) = (32 + 0.2y)/40

To find a numerical value for this expression, we would need to know the value of y.
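As a sanity check, the derivation can be verified numerically for a sample choice of x and y (the values x = 10 and y = 20 below are arbitrary illustrative assumptions, not given in the problem):

```python
# Numerical check of the time-ratio formula (t + t')/T = (32 + 0.2y)/40.
# x and y are arbitrary sample values chosen for illustration.
x = 10.0   # speed on the first leg, miles per hour
y = 20.0   # length of the first leg, miles

t = y / x                        # equation (1): time for the first y miles
t_prime = (40 - y) / (1.25 * x)  # equation (2): time for the last 40 - y miles

T = 40 / x                       # time for the whole trip at x miles per hour

ratio = (t + t_prime) / T
formula = (32 + 0.2 * y) / 40

print(ratio, formula)  # the two values should agree
```

With these sample values the ratio works out to 0.9, i.e. Marla's trip takes 90% of the time the constant-speed trip would take, matching (32 + 0.2·20)/40 = 36/40.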
