A ball is dropped from the top of a building with a height of 500 feet. The height of the ball at time t can be modeled by the function h(t) = -4.9t^2 + h_0, where h_0 is the initial height of the ball. Approximately how long does it take for the ball to reach the ground?


1 Answer

In this problem, h_0 is 500. Substitute this into the equation to get:

h(t) = -4.9t^2 + 500

Next, you want to know how long it will take for the ball to reach the ground. The ground corresponds to a height of 0 in this problem, so you simply set the equation equal to 0 and solve for t:

0 = -4.9t^2 + 500

Now, solve the equation for t. Add 4.9t^2 to both sides to get 4.9t^2 = 500, so t^2 = 500/4.9 ≈ 102.04, and taking the positive square root gives t ≈ 10.1. The ball takes approximately 10.1 seconds to reach the ground.
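As a quick numeric check, the algebra above can be verified in a few lines (a minimal sketch; the variable names `h0` and `a` are just illustrative labels for the values in the problem):

```python
import math

h0 = 500.0   # initial height from the problem statement
a = 4.9      # magnitude of the coefficient in h(t) = -4.9t^2 + h0

# Setting h(t) = 0 gives 4.9t^2 = 500, so t = sqrt(h0 / a)
t = math.sqrt(h0 / a)
print(round(t, 1))  # approximately 10.1
```

This confirms the ball reaches the ground after roughly 10.1 seconds under the given model.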