A hawk flying at a height of 60 feet spots a rabbit on the ground. If the hawk dives at a speed of 55 feet per second, how long will it take the hawk to reach the rabbit?

(Hint: A model for the vertical motion of a projected object is given by the equation h = -16t^2 + vt + s, where h is the height in feet, t is the time in seconds, v is the initial velocity in feet per second, and s is the starting height of the object in feet. Use this equation to find the time taken by the hawk to reach the rabbit.)

asked by Shaohao

1 Answer


Answer:

The hawk reaches the rabbit at t ≈ 0.9 sec

Explanation:

Let

h ----> the height in feet

t ----> the time in seconds

v ----> the initial velocity in feet per second

s ----> the starting height in feet

we have

h = -16t^2 + vt + s

When the hawk reaches the rabbit, the height h is equal to zero.

Because the hawk dives downward, the initial velocity is negative:

v = -55 ft/sec

s = 60 ft

substitute

0 = -16t^2 - 55t + 60

Solve the quadratic equation with the quadratic formula, using a = -16, b = -55, c = 60:

t = (-b ± √(b^2 - 4ac)) / (2a) = (55 ± √(3025 + 3840)) / (-32) = (55 ± √6865) / (-32)

√6865 ≈ 82.9, so the two roots are t ≈ -4.3 and t ≈ 0.9. Time cannot be negative, so we reject the negative root.

The solution is t ≈ 0.9 sec
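As a quick numerical check (a sketch, not part of the original answer), the same two roots can be computed in Python:

import math

# Coefficients of 0 = -16t^2 - 55t + 60 (downward dive, so v = -55)
a, b, c = -16.0, -55.0, 60.0

disc = b**2 - 4*a*c                    # discriminant: 3025 + 3840 = 6865
t1 = (-b + math.sqrt(disc)) / (2*a)    # ≈ -4.3, rejected (negative time)
t2 = (-b - math.sqrt(disc)) / (2*a)    # ≈ 0.87, the physical root

print(round(t2, 2))                    # prints 0.87

Rounded to one decimal place, this confirms t ≈ 0.9 sec.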

answered by Piercus