A stone is thrown horizontally from the edge of a 50-meter-high cliff. The stone lands 100 meters from the base of the cliff.
a) How long does it take for the stone to reach the bottom of the cliff?
b) With what horizontal speed was the stone thrown?


1 Answer


Answer:

a) The stone takes about 3.19 seconds to reach the bottom of the cliff.

b) The stone was thrown at a horizontal speed of 31.3 m/s.

Step-by-step explanation:

Horizontal Motion

Suppose an object is launched horizontally with an initial speed v from a height h. The range, or maximum horizontal distance traveled by the object, can be calculated as follows:


\displaystyle d=v\cdot\sqrt{\frac {2h}{g}}

If we don't know the speed, we can solve the equation for v:


\displaystyle v=d\cdot\sqrt{\frac {g}{2h}}

Another useful formula gives the vertical distance fallen by the object as a function of time:


\displaystyle y=\frac{g\cdot t^2}{2}

To calculate the time the object takes to hit the ground, we solve the above equation for t:


\displaystyle t=\sqrt{\frac{2y}{g}}
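For readers who want to check the algebra numerically, here is a minimal Python sketch of these two formulas, assuming g = 9.8 m/s² and no air resistance (the names fall_time and launch_speed are just illustrative, not from any library):

import math

G = 9.8  # gravitational acceleration, m/s^2

def fall_time(h):
    # Time to fall a vertical distance h from rest: t = sqrt(2h / g)
    return math.sqrt(2 * h / G)

def launch_speed(d, h):
    # Horizontal speed needed to cover range d while falling height h: v = d * sqrt(g / (2h))
    return d * math.sqrt(G / (2 * h))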

a) The stone is thrown horizontally from a height h=50 m and lands at a horizontal distance of d=100 m. Use the last equation to calculate the time taken to reach the ground, noting that the vertical distance y is replaced by the total height h:


\displaystyle t=\sqrt{\frac{2\cdot 50}{9.8}}

t ≈ 3.19 s

The stone takes about 3.19 seconds to reach the bottom of the cliff.
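As a quick check, reusing the fall_time sketch above gives the same value:

print(round(fall_time(50), 2))  # 3.19 (seconds)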

b) The initial speed is:


\displaystyle v=d\cdot\sqrt{\frac {g}{2h}}


\displaystyle v=100\cdot\sqrt{\frac {9.8}{2\cdot 50}}

v=31.3 m/s

The stone was thrown at a horizontal speed of 31.3 m/s.
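Evaluating the launch_speed sketch above reproduces this result, and since the horizontal speed is constant, dividing the range by the fall time confirms it:

print(round(launch_speed(100, 50), 1))  # 31.3 (m/s)
print(round(100 / fall_time(50), 1))    # 31.3 (m/s), consistency check: v = d / t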
