Final answer:
Using the Pythagorean theorem and implicit differentiation, the rate at which the distance from a plane at an altitude of 1 mile and a speed of 420 miles/hr to a radar station is increasing when the plane is 4 miles away is found to be 105√15 ≈ 407 miles/hr.
Step-by-step explanation:
To solve the problem of finding the rate at which the distance from a plane flying horizontally at an altitude of 1 mile and a speed of 420 miles/hr to a radar station is increasing when the plane is 4 miles away from the station, we can use the Pythagorean theorem.
The plane's altitude forms a right triangle with the plane's horizontal distance from the radar and the diagonal distance to the radar station.
Let d be the diagonal distance from the plane to the radar station and x be the horizontal distance from the radar station to the point directly below the plane.
Given that the altitude (y) is 1 mile, we have:
d² = x² + y²
When the diagonal distance is 4 miles, we get:
4² = x² + 1²
16 = x² + 1
15 = x²
x = √15 (≈ 3.87 miles)
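As a quick numerical check of this step (a sketch, not part of the original solution), the horizontal distance follows from the Pythagorean relation d² = x² + y²:

```python
import math

d, y = 4.0, 1.0             # diagonal distance and altitude, in miles
x = math.sqrt(d**2 - y**2)  # horizontal distance: sqrt(16 - 1) = sqrt(15)
print(x)                    # ≈ 3.873 miles
```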
The rate of change of x is given by the speed of the plane, which is 420 miles/hr.
To find the rate of change of d, we differentiate the equation with respect to time t: 2d(dd/dt) = 2x(dx/dt)
Plugging in the known values, we get:
2(4)(dd/dt) = 2(√15)(420)
8(dd/dt) = 840√15
dd/dt = (840√15) / 8
dd/dt = 105√15
After rounding to the nearest whole number, the rate at which the distance is increasing is approximately 407 miles/hr.
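The result can be sanity-checked without calculus by comparing the distance a split second later (a finite-difference sketch; the step size h is an arbitrary small value chosen for illustration):

```python
import math

def distance(x, y=1.0):
    """Diagonal distance to the radar station at horizontal distance x."""
    return math.sqrt(x**2 + y**2)

speed = 420.0       # dx/dt, the plane's horizontal speed in miles/hr
x = math.sqrt(15)   # horizontal distance when the diagonal distance is 4 miles
h = 1e-6            # small time step in hours (illustrative choice)

# Approximate dd/dt as the change in distance over the small time step.
rate = (distance(x + speed * h) - distance(x)) / h
print(rate)         # ≈ 406.6 miles/hr, matching 105*sqrt(15)
```

The finite-difference estimate agrees with the exact value 105√15 ≈ 406.6, confirming the differentiation.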