You throw a baseball upward with an initial velocity of 35 feet per second. The height h (in feet) of the baseball relative to your glove is modeled by the position function

h(t) = −16t² + 35t,
where t is the time in seconds. How long does it take for the ball to reach your glove? (Round your answer to one decimal place.)

by Henry Ward (2.8k points)

1 Answer

19 votes

Answer:

t = 35/16 sec = 2 3/16 sec ≈ 2.2 sec (rounded to one decimal place)

Explanation:

This assumes your glove is at the same height from which you threw the ball, i.e. h = 0.

0 = −16t² + 35t

0 = t(−16t + 35), which gives t = 0 (the instant you throw it) and t = 35/16 ≈ 2.2 sec
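The factoring above can be double-checked numerically. A minimal sketch (the function name `h` just mirrors the problem statement; it is not from the original answer):

```python
# Height of the ball relative to the glove, per the problem statement.
def h(t):
    return -16 * t**2 + 35 * t

# Roots from the factorization 0 = t(-16t + 35).
roots = [0.0, 35 / 16]

# Both roots should give a height of zero.
for t in roots:
    assert abs(h(t)) < 1e-9

# The nonzero root, rounded to one decimal place as the question asks.
print(round(35 / 16, 1))  # → 2.2
```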

by Bertram Gilfoyle (2.6k points)