An object is thrown downward with an initial velocity of 12 m/s off a cliff that is 420 m high. Use the formula d(t) = 4.9t^2 + 12t, where d(t) is the distance fallen in meters and t is the time in seconds. How long does it take for the object to hit the ground?

1 Answer


We know the distance the object will travel is equal to the height of the cliff. Therefore,


d(t)=4.9t^2+12t\rightarrow 420=4.9t^2+12t

We then solve for t. Notice that we get a quadratic equation:


\begin{gathered} 4.9t^2+12t=420\rightarrow 4.9t^2+12t-420=0 \\ \text{Solve using the quadratic formula:} \\ t=\frac{-12\pm\sqrt{12^2-4(4.9)(-420)}}{2(4.9)} \\ \rightarrow t=\frac{-12\pm\sqrt{8376}}{9.8} \\ \rightarrow t_1=8.11\text{ s} \\ \rightarrow t_2=-10.56\text{ s} \end{gathered}

We discard the negative solution, since a negative time has no physical meaning here.

Therefore, it takes the object 8.11 s to hit the ground.
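
If you want to double-check the arithmetic, here is a minimal Python sketch (an illustrative addition, not part of the original solution) that plugs the coefficients into the quadratic formula:

import math

# Coefficients of a*t^2 + b*t + c = 0, from 4.9t^2 + 12t - 420 = 0
a, b, c = 4.9, 12.0, -420.0

disc = b**2 - 4*a*c                    # discriminant: 144 + 8232 = 8376
t1 = (-b + math.sqrt(disc)) / (2*a)    # positive root (time to hit the ground)
t2 = (-b - math.sqrt(disc)) / (2*a)    # negative root (discarded)

print(t1, t2)

Running this prints roughly 8.11 and -10.56, matching the two roots found above.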
