A man is standing on the edge of a 20.0 m high cliff. He throws a rock horizontally with an initial velocity of 10.0 m/s.

a. How long does it take to reach the ground?
b. How far does the rock land from the base of the cliff?


1 Answer


Answer:

a. t = 2.02 s

b. d = 20.2 m

Step-by-step explanation:

Horizontal Motion

If an object is thrown horizontally from a height h with a speed v, it follows a parabolic path governed solely by gravity (air resistance is neglected) until it hits the ground.

The time the object takes to hit the ground can be calculated as follows:


\displaystyle t=\sqrt{\frac{2h}{g}}

Note that the fall time does not depend on the initial horizontal speed.

The range or maximum horizontal distance traveled by the object can be calculated by the equation:


\displaystyle d=v\cdot t
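
As a quick illustration, here is a minimal Python sketch of these two formulas (the function names and the value of g are my own choices, not part of the original answer):

```python
import math

G = 9.8  # gravitational acceleration in m/s^2 (assumed value)

def fall_time(h):
    """Time to fall from height h (m) for a horizontally launched object."""
    return math.sqrt(2 * h / G)

def horizontal_range(v, h):
    """Horizontal distance covered at speed v (m/s) while falling from height h (m)."""
    return v * fall_time(h)
```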

The man standing on the edge of the h = 20.0 m cliff throws a rock with an initial horizontal speed of v = 10.0 m/s.

a.

The time taken by the rock to reach the ground is:


\displaystyle t=\sqrt{\frac{2\cdot 20.0}{9.8}}


\displaystyle t=\sqrt{4.0816}

t = 2.02 s

b.

The range is:


\displaystyle d=10.0\cdot 2.02

d = 20.2 m
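
To double-check the arithmetic, here is the same computation as a short, self-contained Python snippet (the variable names are mine):

```python
import math

g = 9.8   # gravitational acceleration, m/s^2
h = 20.0  # cliff height, m
v = 10.0  # initial horizontal speed, m/s

t = math.sqrt(2 * h / g)  # fall time
d = v * t                 # horizontal range
print(f"t = {t:.2f} s")   # prints: t = 2.02 s
print(f"d = {d:.1f} m")   # prints: d = 20.2 m
```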
