An airplane 30,100 feet above the ground begins descending at the rate of 2,150 feet per minute. Write and solve a linear equation to find how long it will take the plane to reach the ground.

asked by Slartidan

2 Answers

6 votes
30,100 = 2,150x
where x = the number of minutes.
Divide both sides by 2,150:
30,100/2,150 = 2,150x/2,150
The 2,150 on the right cancels out, so you're left with just x.
Then divide 30,100 by 2,150 = 14
x = 14 minutes for the plane to completely descend.
You're welcome.
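A quick sanity check of that division, as a minimal Python sketch (the variable names here are just illustrative, not part of the original answer):

```python
# Check: descending 30,100 ft at 2,150 ft/min should take 14 minutes.
altitude_ft = 30_100             # starting height above the ground
descent_rate_ft_per_min = 2_150  # feet lost per minute

minutes_to_ground = altitude_ft / descent_rate_ft_per_min
print(minutes_to_ground)         # 14.0
```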
answered by Dogcat
4 votes

Answer: 14 minutes

Explanation:

Given: An airplane 30,100 feet above the ground begins descending at the rate of 2,150 feet per minute.

i.e. Initial position of the airplane = 30,100 feet above the ground

Speed = -2,150 feet per minute (negative because the plane is descending).

Let x denote the number of minutes.

Then the distance traveled by the airplane in x minutes = Speed × Time = -2150x.

So the required linear equation is:

f(x) = 30100 - 2150x, where f(x) is the distance from the ground after x minutes.

When the plane reaches the ground, f(x) = 0.


⇒ 0 = 30100 - 2150x
⇒ 2150x = 30100
⇒ x = 30100/2150 = 14

Hence, it will take 14 minutes to reach the ground.
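As an illustration of the same model, here is a minimal Python sketch (the function and variable names are assumptions for this example, not part of the original answer) that encodes f(x) = 30100 - 2150x and checks the solution x = 14:

```python
# Height above the ground (in feet) after x minutes of descent.
def f(x):
    return 30_100 - 2_150 * x

# Solving f(x) = 0 gives x = 30100 / 2150.
x = 30_100 / 2_150
print(x)     # 14.0
print(f(x))  # 0.0 -- the plane is on the ground after 14 minutes
```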

answered by Stephzcj
