A plane is traveling horizontally at a speed u = 40 m/s at a height h = 100 m above the ground when it drops a package, as shown in the figure. How long does it take the dropped package to reach the ground, and where does it land?


1 Answer


Answer:

(a) First get the time in the air from

h = gt²/2 ⇒

t = √(2h/g)

Plug in

h = 100 m

g = 9.8 m/s²

and get t in seconds. Then multiply t by the constant horizontal speed.

x = vt

Plug in

v = 40 m/s

t = ? (result from above)

and get x in meters.
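
For a quick numerical check, here is a minimal Python sketch of the two steps above; the variable names just mirror the given values:

```python
import math

h = 100.0   # drop height in meters
g = 9.8     # gravitational acceleration in m/s²
v = 40.0    # constant horizontal speed in m/s

t = math.sqrt(2 * h / g)   # fall time: t = √(2h/g)
x = v * t                  # horizontal distance covered while falling

print(f"t = {t:.2f} s")    # ≈ 4.52 s
print(f"x = {x:.1f} m")    # ≈ 180.7 m
```

So the package lands roughly 180 m ahead of the point where it was released, measured along the ground.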

(b)

Vertical speed:

v(t) = v₀ + gt

Plug in

v₀ = 0

g and t from above

and get v(t) in m/s.
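
A similarly minimal sketch for the vertical speed at impact, recomputing the fall time so it runs on its own:

```python
import math

h, g = 100.0, 9.8            # drop height (m) and gravity (m/s²)
t = math.sqrt(2 * h / g)     # fall time from part (a)
vy = 0.0 + g * t             # v(t) = v₀ + gt with v₀ = 0 (package is dropped, not thrown)

print(f"vy = {vy:.1f} m/s")  # ≈ 44.3 m/s, directed downward
```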

(c) The horizontal speed at impact is the same as the horizontal speed at release, 40 m/s, since gravity produces no horizontal acceleration.

(d) The horizontal component of the velocity at impact is the answer from part (c); the vertical component is the result from part (b), directed downward.
