Given :
Initial velocity, u = 12.5 m/s.
Height of camera, h = 64.3 m.
Acceleration due to gravity, g = 9.8 m/s².
To Find :
The time taken by the camera to reach the ground.
Solution :
By the second equation of motion :

$$h = ut + \frac{1}{2}gt^2$$
Putting all given values, we get :

$$12.5t + \frac{9.8t^2}{2} = 64.3$$

$$4.9t^2 + 12.5t - 64.3 = 0$$
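This quadratic can be solved with the quadratic formula; the worked step below is added for clarity, with $a = 4.9$, $b = 12.5$, $c = -64.3$ :

$$t = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} = \frac{-12.5 \pm \sqrt{(12.5)^2 + 4(4.9)(64.3)}}{2(4.9)} = \frac{-12.5 \pm \sqrt{1416.53}}{9.8}$$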
t = 2.56 s and t = −5.116 s.

Since time cannot be negative, t = 2.56 s.
Therefore, the time taken is 2.56 s.
Hence, this is the required solution.
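As a quick numerical check, here is an illustrative Python sketch (not part of the original answer) that solves the same quadratic :

```python
import math

# Known quantities from the problem
u = 12.5   # initial (downward) speed, m/s
h = 64.3   # drop height, m
g = 9.8    # acceleration due to gravity, m/s^2

# h = u*t + (g/2)*t^2  rearranged to  (g/2)*t^2 + u*t - h = 0
a, b, c = g / 2, u, -h
disc = b**2 - 4 * a * c
roots = [(-b + math.sqrt(disc)) / (2 * a),
         (-b - math.sqrt(disc)) / (2 * a)]

# Time cannot be negative, so keep the positive root
t = max(roots)
print(f"t = {t:.2f} s")  # prints: t = 2.56 s
```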