An object is thrown in the air with an initial velocity of 5 m/s from a height of 9 m. The equation h(t) = -4.9t^2 + 5t + 9 models the height of the object in meters after t seconds. About how many seconds does it take for the object to hit the ground? Round your answer to the nearest hundredth of a second.

Question 11 options:

0.94 seconds
1.50 seconds
2.00 seconds
9.00 seconds


2 Answers

After being thrown, the object takes about 2 seconds to hit the ground. We can find this from the positive t-intercept of the graph of h(t), the point on the right where the parabola crosses the horizontal axis.

Answer:

The time taken by the object to hit the ground is about 2 seconds (1.96 seconds to the nearest hundredth).

Explanation:

It is given that,

Initial velocity of an object, u = 5 m/s

Height, h = 9 m

The equation that models the height of the object in meters after t seconds is:


h(t)=-4.9t^2+5t+9

We have to find the time for the object to hit the ground, i.e. the time t at which h(t) = 0:

-4.9t^2+5t+9=0

Applying the quadratic formula with a = -4.9, b = 5 and c = 9:

t = ( -5 ± √(5^2 - 4(-4.9)(9)) ) / (2(-4.9)) = ( -5 ± √201.4 ) / (-9.8)

This gives t ≈ -0.94 s or t ≈ 1.96 s. The negative root is not physical, so

t ≈ 1.96 seconds

or about

t = 2 seconds

Hence, the correct option is the third one, "2.00 seconds".
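
As a quick numeric check, here is a minimal Python sketch (not part of the original answer; the variable names are my own) that applies the quadratic formula to the same coefficients and keeps the physically meaningful positive root:

import math

# Coefficients of h(t) = -4.9t^2 + 5t + 9
a, b, c = -4.9, 5.0, 9.0

# Quadratic formula: t = (-b ± sqrt(b^2 - 4ac)) / (2a)
disc = b * b - 4 * a * c                 # 25 + 176.4 = 201.4
t1 = (-b + math.sqrt(disc)) / (2 * a)    # ≈ -0.94 s (not physical)
t2 = (-b - math.sqrt(disc)) / (2 * a)    # ≈  1.96 s

t_ground = max(t1, t2)                   # keep the positive root
print(round(t_ground, 2))                # prints 1.96, i.e. about 2.00 seconds

Note that the discarded root, -0.94, matches the magnitude of the first answer choice, which is why "0.94 seconds" appears as a distractor.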
