5. An airplane at 30,100 feet above the ground begins descending at the rate of 2,150 feet per minute.

Write and solve a linear equation to find how many minutes it will take the plane to reach the ground.


1 Answer


Answer:

h(t) = -2150t + 30100; 14 minutes to reach ground

Explanation:

We can make height a function of time.

Because the plane is descending, the slope must be negative: the descent rate of 2,150 feet per minute gives a slope of -2150. The y-intercept is 30,100, since at time 0 the plane is 30,100 feet in the air.

Plugging this information in, our equation is h(t) = -2150t + 30100.

To find how long it takes the plane to reach the ground (i.e., 0 feet), we simply set the function equal to 0 and solve for t:


0 = -2150t + 30100
2150t = 30100
t = 30100 / 2150 = 14

Thus, the plane reaches the ground in 14 minutes.
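
If you want to double-check the arithmetic, here is a minimal Python sketch of the same model (the names h, DESCENT_RATE, and START_ALTITUDE are just illustrative, not from the original problem):

```python
# Minimal sketch: model the altitude as a linear function of time,
# then solve h(t) = 0 for t.

DESCENT_RATE = 2150     # feet per minute (descending, so the slope is -2150)
START_ALTITUDE = 30100  # feet above the ground at t = 0

def h(t):
    """Altitude of the plane, in feet, after t minutes."""
    return -DESCENT_RATE * t + START_ALTITUDE

# Setting h(t) = 0 and solving gives t = 30100 / 2150.
t_ground = START_ALTITUDE / DESCENT_RATE
print(t_ground)     # 14.0 -> the plane reaches the ground in 14 minutes
print(h(t_ground))  # 0.0  -> altitude is indeed zero at that time
```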
