A baseball is thrown from a height of 5 feet. The height, h, of the ball at time t seconds is modeled by the equation h(t) = -16t^2 + 100t + 5. How long will it take the ball to reach the ground?

Select one:
a. 5.8 seconds
b. 6.3 seconds
c. 7.2 seconds
d. 7.6 seconds


1 Answer


Solution

For this case we have the following model for the height:


h(t)=-16t^2+100t+5

We want to find the time at which the ball reaches the ground, that is, when h(t)=0:


-16t^2+100t+5=0

Solving for t with the quadratic formula, we get:


t=\frac{-100\pm\sqrt{(100)^2-4(-16)(5)}}{2\cdot(-16)}

Then the two possible solutions are:

t \approx 6.299 s or t \approx -0.049 s

Since a negative time is not physically meaningful, we discard t \approx -0.049 s, and the answer is:

b. 6.3 seconds
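
For a quick numeric check, here is a minimal Python sketch of the same quadratic-formula calculation (the variable names are my own and the snippet is not part of the original answer):

import math

# Coefficients of h(t) = -16t^2 + 100t + 5
a, b, c = -16.0, 100.0, 5.0

# Quadratic formula: t = (-b +/- sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c                       # 10000 + 320 = 10320
t_plus = (-b + math.sqrt(disc)) / (2*a)   # about -0.049 s (not physical)
t_minus = (-b - math.sqrt(disc)) / (2*a)  # about 6.299 s

print(round(t_minus, 1))  # 6.3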
