An airplane flying at an altitude of 6 miles and a speed of 300 miles/hour passes directly over a radar antenna. How fast is the distance between the airplane and the antenna increasing when that distance is 10 miles?

1 Answer


Final answer:

The distance between the airplane and the radar antenna is increasing at a rate of 240 miles per hour when that distance is 10 miles.

Step-by-step explanation:

To find how fast the distance between the airplane and the radar antenna is increasing, we use related rates. Let x be the horizontal distance from the antenna to the point on the ground directly below the airplane, and let s be the straight-line distance from the antenna to the airplane. The altitude is constant at 6 miles, and the airplane's speed tells us the horizontal distance is increasing at dx/dt = 300 miles/hour.

The antenna, the point on the ground directly below the airplane, and the airplane form a right triangle: the legs are the horizontal distance x and the altitude of 6 miles, and the hypotenuse is the distance s. By the Pythagorean theorem, x^2 + 6^2 = s^2. When s = 10 miles, x = √(10^2 - 6^2) = √64 = 8 miles.
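
As a quick sanity check on that step, here is a tiny Python snippet (the variable names are my own, not part of the problem) that recovers the horizontal distance from the given altitude and hypotenuse:

import math

altitude = 6.0   # miles, constant
s = 10.0         # straight-line distance from antenna to airplane, miles

# Pythagorean theorem: x^2 + altitude^2 = s^2
x = math.sqrt(s**2 - altitude**2)
print(x)         # 8.0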

Differentiating both sides of x^2 + 6^2 = s^2 with respect to time t gives 2x(dx/dt) = 2s(ds/dt). Solving for ds/dt, we get ds/dt = (x/s)(dx/dt).

Now substitute x = 8 miles, s = 10 miles, and dx/dt = 300 miles/hour: ds/dt = (8/10)(300) = 240 miles/hour. So the distance between the airplane and the antenna is increasing at 240 miles per hour at that moment.
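
If you want to verify the whole related-rates computation symbolically, here is a short Python sketch using sympy (the library choice and the symbol names are my own assumptions, not part of the original answer); it differentiates the distance formula via the chain rule and substitutes the values found above:

from sympy import symbols, sqrt, diff

x, dxdt = symbols('x dxdt', positive=True)

altitude = 6
s = sqrt(x**2 + altitude**2)   # straight-line distance as a function of x

# Chain rule: ds/dt = (ds/dx) * (dx/dt)
dsdt = diff(s, x) * dxdt

# Substitute x = 8 miles and dx/dt = 300 miles/hour
print(dsdt.subs({x: 8, dxdt: 300}))   # prints 240

Running it prints 240, matching the hand computation.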
