2 votes
4) A plane flying horizontally at an altitude of 1 mile and a speed of 420 miles/hr passes directly over a radar station. Find the rate at which the distance from the plane to the station is increasing when it is 4 miles away from the station. (Round your answer to the nearest whole number.) (10 points) You must show all your work.

by WebbH (8.2k points)

2 Answers

2 votes

Final answer:

Using the Pythagorean theorem and differentiation, the rate at which the distance from a plane at an altitude of 1 mile and a speed of 420 miles/hr to a radar station is increasing when the plane is 4 miles away is found to be approximately 407 miles/hr.

Step-by-step explanation:

To find the rate at which the distance from the plane to the radar station is increasing when the plane is 4 miles from the station, we can use the Pythagorean theorem together with implicit differentiation (related rates).

The plane's altitude, its horizontal distance from the radar station, and the diagonal distance to the station form a right triangle.

Let d be the diagonal distance from the plane to the radar station and x be the horizontal distance from the radar station to the point directly beneath the plane.

Given that the altitude (y) is 1 mile, we have:

d² = x² + y²

When the diagonal distance is 4 miles, we get:

4² = x² + 1²

16 = x² + 1

15 = x²

x = √15 (≈ 3.87 miles)

The rate of change of x is given by the speed of the plane, which is 420 miles/hr.

To find the rate of change of d, we differentiate the equation with respect to time t: 2d(dd/dt) = 2x(dx/dt)

Plugging in the known values, we get:

2(4)(dd/dt) = 2(√15)(420)

8(dd/dt) = 840√15

dd/dt = (840√15) / 8

dd/dt = 105√15 ≈ 406.6

After rounding to the nearest whole number, the rate at which the distance is increasing is approximately 407 miles/hr.
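
As a quick numerical check on the computation above, here is a minimal Python sketch (the variable names are my own) that evaluates dd/dt = x(dx/dt)/d with the values used in this answer, treating the 4 miles as the diagonal distance d.

```python
import math

altitude = 1.0   # miles, constant
d = 4.0          # diagonal distance from plane to station, miles
dx_dt = 420.0    # horizontal speed of the plane, miles/hr

# From d² = x² + 1², recover the horizontal distance x
x = math.sqrt(d**2 - altitude**2)   # √15 ≈ 3.873

# From 2d(dd/dt) = 2x(dx/dt), solve for dd/dt
dd_dt = x * dx_dt / d
print(round(dd_dt))                 # 407
```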

by Fronzee (6.8k points)
3 votes

Final answer:

The rate at which the distance from the plane to the radar station is increasing when the plane is 4 miles away is approximately 407 miles per hour after applying the Pythagorean theorem and differentiating with respect to time.

Step-by-step explanation:

The problem presents a scenario where a plane flying horizontally at an altitude of 1 mile with a speed of 420 miles per hour passes over a radar station. We need to find the rate at which the distance from the plane to the radar station increases when the plane is 4 miles away from the station. We can approach this problem using the Pythagorean theorem in a right triangle, where the altitude represents one leg, the horizontal distance from the radar to the plane above represents the other leg, and the slant distance from the radar to the plane represents the hypotenuse.

Let x represent the horizontal distance from the radar station to the point directly below the plane, and y represent the slant distance from the station to the plane. Here the 4 miles is taken as x, and the altitude is constant at 1 mile. The Pythagorean theorem gives us y² = x² + 1². To find dy/dt when x = 4 miles, we differentiate both sides of the equation with respect to time t:

2y(dy/dt) = 2x(dx/dt)

Plugging in the values of x = 4 miles and dx/dt = 420 miles/hr (the speed of the plane), and knowing that y can be found using the Pythagorean theorem:

y = √(x² + 1²) = √(4² + 1²) = √17

We get:

2(√17)(dy/dt) = 2(4)(420)

(dy/dt) = 4(420) / √17 = 1680 / √17 ≈ 407.5

Rounding to the nearest whole number gives the rate at which the distance from the plane to the station is increasing when the plane is 4 miles from the station:

(dy/dt) ≈ 407 miles per hour.
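
For a quick check of this answer's numbers, here is a similar Python sketch (again with my own variable names), taking the 4 miles as the horizontal distance x, as this answer does.

```python
import math

x = 4.0          # horizontal distance from the station to the point below the plane, miles
altitude = 1.0   # miles, constant
dx_dt = 420.0    # horizontal speed of the plane, miles/hr

# Slant distance from the Pythagorean theorem
y = math.sqrt(x**2 + altitude**2)   # √17 ≈ 4.123

# From 2y(dy/dt) = 2x(dx/dt), solve for dy/dt
dy_dt = x * dx_dt / y
print(round(dy_dt))                 # 407
```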

by Alexander Sorokin (8.3k points)