5 votes
An air traffic controller spots two airplanes at the same altitude converging to a point as they fly at right angles to each other. One airplane is 150 miles from the point and has a speed of 300 miles per hour. The other is 200 miles from the point and has a speed of 400 miles per hour.

(a) At what rate is the distance between the planes decreasing?
(b) How much time does the air traffic controller have to get one of the planes on a different flight path?

2 Answers

6 votes

Answer:

a) -500 mph

b) 1/2 h

Explanation:

a)
\frac{150(-300) + 200(-400)}{\sqrt{150^2 + 200^2}} = \frac{-125000}{250} = -500 \text{ mph}

b)
\frac{\sqrt{150^2 + 200^2}}{500} = \frac{250}{500} = \frac{1}{2} \text{ h}
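
Both formulas follow from differentiating the Pythagorean relation between the planes' positions. Writing x and y for the two planes' distances from the intersection point and s for the separation (symbols chosen here just for the derivation, not part of the original answer):

s^2 = x^2 + y^2 \quad\Rightarrow\quad s\frac{ds}{dt} = x\frac{dx}{dt} + y\frac{dy}{dt} \quad\Rightarrow\quad \frac{ds}{dt} = \frac{x\frac{dx}{dt} + y\frac{dy}{dt}}{\sqrt{x^2 + y^2}}

Substituting x = 150, dx/dt = -300, y = 200, dy/dt = -400 gives part (a). For part (b), the planes close at a constant 500 mph from an initial separation of 250 miles, hence 250/500 = 1/2 hour.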

by Sylvain Biehler (5.7k points)
3 votes

Answer:

(a) D(t) = 250 - 500t miles, so the separation is decreasing at 500 mph

(b) The controller has 1/2 hour (30 minutes), less whatever time the pilots need to divert course or altitude.

Explanation:

Given:

two planes at the same altitude on a collision course.

Plane A is 200 miles from the collision point, flying at 400 mph.

Plane B is 150 miles from the collision point, flying at 300 mph.

A theoretical collision therefore happens at

t = 200/400 = 150/300 = 1/2 hour

Distance ya of plane A from the collision point as a function of time in hours:

ya(t) = 200 - 400t

Distance yb of plane B from the collision point as a function of time in hours:

yb(t) = 150 - 300t

(a) Distance between the two planes:

Since the two planes are on courses perpendicular to each other, the Pythagorean theorem gives

D(t) = sqrt(ya(t)^2 + yb(t)^2)

= sqrt((200 - 400t)^2 + (150 - 300t)^2)

= sqrt(400^2 + 300^2) (1/2 - t)    (factoring 200 - 400t = 400(1/2 - t) and 150 - 300t = 300(1/2 - t), valid for t <= 1/2)

D(t) = 250 - 500t miles
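
Differentiating this linear function recovers the rate asked for in part (a), matching the other answer:

dD/dt = -500 mph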

(b) Time available:

Time until D(t) = 0:

250 - 500t = 0

t = 1/2 hour (30 minutes)
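
A quick numerical check of both parts (a minimal Python sketch; the variable names are mine, not from either answer):

import math

# Given data from the problem statement
xa0, va = 200.0, 400.0  # plane A: miles from the point, speed in mph
xb0, vb = 150.0, 300.0  # plane B: miles from the point, speed in mph

def separation(t):
    """Distance in miles between the planes t hours from now."""
    ya = xa0 - va * t  # plane A's remaining distance to the point
    yb = xb0 - vb * t  # plane B's remaining distance to the point
    return math.hypot(ya, yb)  # Pythagorean distance (perpendicular courses)

# (a) closing rate: central-difference approximation of dD/dt at t = 0
h = 1e-6
rate = (separation(h) - separation(-h)) / (2 * h)
print(f"dD/dt at t = 0: {rate:.1f} mph")  # prints -500.0 mph

# (b) time until both planes reach the point
print(f"time to point: {xa0 / va} h")  # prints 0.5 h (same for plane B)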

by Xianshenglu (4.9k points)