A photographer in a helicopter ascending vertically at a constant rate of 11.5 m/s accidentally drops a camera out the window when the helicopter is 70.0 m above the ground.

A) How long will the camera take to reach the ground?
B) What will its speed be when it hits?

1 Answer

The camera is released while moving with the helicopter, so it leaves the window with an upward velocity of 11.5 m/s; it is not simply dropped from rest. After release the only acceleration acting on it is gravity. Taking up as positive and measuring height from the ground, the camera's height is

y(t) = 70.0 + 11.5t - (1/2)(9.81)t²

A) Setting y = 0 and keeping the positive root of the quadratic:

4.905t² - 11.5t - 70.0 = 0
t = [11.5 + √(11.5² + 4(4.905)(70.0))] / (2 × 4.905) ≈ 5.13 s

B) The speed at impact follows from v² = v₀² + 2gy, where y = 70.0 m is the drop height (or equivalently from v = v₀ - gt with the time above):

v = √(11.5² + 2(9.81)(70.0)) ≈ 38.8 m/s
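As a quick numerical check, here is a minimal Python sketch of the same two steps. The variable names and the value g = 9.81 m/s² are my own choices for illustration, not part of the original problem statement.

```python
import math

# Values given in the problem
v0 = 11.5   # upward velocity of the camera at release, m/s
y0 = 70.0   # height above the ground at release, m
g = 9.81    # magnitude of gravitational acceleration, m/s^2 (assumed value)

# Part A: height (up positive) is y(t) = y0 + v0*t - 0.5*g*t^2.
# Setting y(t) = 0 gives 0.5*g*t^2 - v0*t - y0 = 0; keep the positive root.
a, b, c = 0.5 * g, -v0, -y0
t_impact = (-b + math.sqrt(b**2 - 4 * a * c)) / (2 * a)
print(f"time to reach the ground: {t_impact:.2f} s")   # about 5.13 s

# Part B: impact speed, from v^2 = v0^2 + 2*g*y0, cross-checked with v = v0 - g*t.
v_impact = math.sqrt(v0**2 + 2 * g * y0)
print(f"impact speed: {v_impact:.1f} m/s")              # about 38.8 m/s
print(f"check via v0 - g*t: {abs(v0 - g * t_impact):.1f} m/s")
```

Both routes to the impact speed agree, which is a useful sanity check on the time found in part A.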

Hope this helps.
answered by Olivier Ma