A parachutist with a camera descends in free fall at a speed of 10 m/s.


1 Answer


Final answer:

If the parachutist, descending at 10 m/s, releases the camera at an altitude of 50 m, the camera takes approximately 2.3 seconds to reach the ground.

Step-by-step explanation:

The question states that the parachutist descends at a steady 10 m/s, so the camera is released with an initial downward velocity u = 10 m/s and must fall a distance S = 50 m. To find the time it takes the camera to reach the ground, we use the equation of motion S = ut + 1/2at², where S is the distance covered, u is the initial velocity, a is the acceleration (here a = g ≈ 10 m/s²), and t is the time taken.

Substituting these values into the formula gives:

50 = 10t + 1/2(10)t²

50 = 10t + 5t²

5t² + 10t − 50 = 0

Dividing through by 5:

t² + 2t − 10 = 0

Solving with the quadratic formula, t = (−b ± √(b² − 4ac)) / 2a, where a = 1, b = 2, c = −10:

t = (−2 ± √(4 + 40)) / 2

t = (−2 ± √44) / 2

Taking the positive root (time cannot be negative):

t = (−2 + √44) / 2 ≈ 2.3 seconds

Therefore, it takes the camera about 2.3 seconds to reach the ground.
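As a quick check, here is a minimal Python sketch (assuming g ≈ 10 m/s², as in the working above) that solves the same quadratic numerically:

```python
import math

u = 10.0  # initial downward speed of the camera, m/s
s = 50.0  # release altitude, m
g = 10.0  # gravitational acceleration, approximated as 10 m/s^2

# s = u*t + (1/2)*g*t^2  rearranges to  (g/2)*t^2 + u*t - s = 0
a, b, c = g / 2, u, -s

# Quadratic formula; the positive root is the physical solution
t = (-b + math.sqrt(b**2 - 4 * a * c)) / (2 * a)
print(f"Time to reach the ground: {t:.2f} s")  # prints ~2.32 s
```

Running this prints about 2.32 s, which rounds to the 2.3 seconds found above.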

Your question is incomplete, but most probably your full question was:

A parachutist with a camera descends in free fall at a speed of 10 m/s. The parachutist releases the camera at an altitude of 50 m. How long does it take the camera to reach the ground?
