An airplane flies at 40 m/s at an altitude of 500 m. The pilot drops a package that falls to the ground. How long does it take for the package to reach the ground?

asked by Tomoyo

1 Answer

Answer:

10.1 s

Step-by-step explanation:

The time of flight of the package is entirely determined by its vertical motion, which is free fall (the package is acted upon by the force of gravity only). The horizontal speed of 40 m/s does not affect the fall time. Since free fall is uniformly accelerated motion, we can use the suvat equation:

$$s = ut + \frac{1}{2}gt^2$$

where

s is the vertical distance covered

u is the initial vertical velocity

t is the time of flight


g = 9.8 m/s^2 is the acceleration due to gravity

For the package in this problem, we have:

s = 500 m is the vertical distance covered

u = 0 is the initial vertical velocity (initially it has only horizontal motion)

So, solving for t, we find the time of flight:

$$t = \sqrt{\frac{2s}{g}} = \sqrt{\frac{2(500)}{9.8}} = 10.1 \text{ s}$$
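As a quick check, the same arithmetic can be reproduced in a few lines of Python (variable names here are just for illustration):

```python
import math

g = 9.8   # acceleration due to gravity (m/s^2)
s = 500   # vertical distance to fall (m)

# With zero initial vertical velocity (u = 0), s = (1/2) g t^2,
# which rearranges to t = sqrt(2s / g).
t = math.sqrt(2 * s / g)
print(f"{t:.1f} s")  # 10.1 s
```

Note that the horizontal speed of 40 m/s never enters the calculation; it would only matter if we asked how far downrange the package lands.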

answered by Dennissv