3 votes
Traveling at an average speed of 50 miles per hour, the trip from point A to point B takes 2 hours. Traveling at an average speed of 40 miles per hour, the same trip takes 2.5 hours. If time, y, varies inversely with speed, x, how long will the trip take traveling at 45 miles per hour?

asked by Netwons (7.2k points)

1 Answer

5 votes

Answer:


2 2/9 hours

Explanation:

At a speed of 50 mi/h, the time taken is 2 hours.

At a speed of 40 mi/h, the time taken is 2.5 hours.

You can find the distance between A and B:

Distance = speed × time

D = 50 × 2 = 100 miles

But time y varies inversely with speed x, hence

y = k/x

where k is the constant of proportionality; here k equals the distance D.

To find k


2 = k/50, so k = 2 × 50 = 100, which gives y = 100/x

where y is the time in hours and x is the speed in mi/h.

Given x = 45 mi/h, find the time y by applying the expression above:


y = 100/x = 100/45 = 20/9 = 2 2/9 hours
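If you want to sanity-check the arithmetic, here is a minimal Python sketch (the variable names are my own, not from the original problem) that derives the constant k from the given data and evaluates the time at 45 mi/h:

```python
from fractions import Fraction

# Find k from the first data point: 2 = k / 50  =>  k = 100 (miles)
k = Fraction(2) * 50

# Check the second data point: 100 / 40 should equal 2.5 hours
assert k / 40 == Fraction(5, 2)

# Time at 45 mi/h: y = 100 / 45 = 20/9 hours = 2 2/9 hours
y = k / Fraction(45)
print(y, float(y))   # 20/9, approximately 2.22 hours
```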

answered by Dzumret (7.3k points)