A geologist throws their rock hammer off a 133 m cliff with a horizontal velocity of 22 m/s. How far from the base of the cliff did the hammer land?


1 Answer

Final answer:

To find how far the hammer lands from the base of the cliff, we first calculate the time it takes to fall 133 m and then multiply that time by the horizontal velocity. The hammer falls for approximately 5.2 seconds and therefore lands about 114 meters from the base of the cliff.

Step-by-step explanation:

This is a projectile motion (kinematics) problem. To determine how far from the base of the cliff the hammer lands, we treat the two components of the motion separately: the horizontal motion, which proceeds at constant velocity, and the vertical motion, which is free fall under gravity. The horizontal velocity is 22 m/s and the height of the cliff is 133 m.

First, we calculate the time it takes the hammer to reach the ground. Since the vertical motion is free fall from rest, we use the equation for vertical displacement:

s = ut + 0.5·a·t²

Where:

  • s = vertical displacement (133 m)
  • u = initial vertical velocity (0 m/s, since it's thrown horizontally)
  • a = acceleration due to gravity (approx. 9.8 m/s²)
  • t = time (unknown)

By substituting the known values, we get:

133 = 0 + 0.5 × 9.8 × t²

t² = 133 / 4.9 ≈ 27.1

t = √27.1 ≈ 5.2 seconds

Now, we calculate the horizontal distance traveled using the horizontal velocity:

Horizontal distance = horizontal velocity * time

Horizontal distance = 22 m/s × 5.2 s

Horizontal distance ≈ 114.4 meters

So, the hammer would land approximately 114.4 meters from the base of the cliff.
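If you want to check the arithmetic, here is a minimal Python sketch of the same two-step calculation (the variable names are just illustrative, not part of the original problem):

```python
import math

# Known quantities from the problem (taking "down" as the positive direction)
height = 133.0        # cliff height in metres
v_horizontal = 22.0   # horizontal launch speed in m/s
g = 9.8               # acceleration due to gravity in m/s^2

# Time to fall: height = 0.5 * g * t^2, since the initial vertical velocity is zero
t = math.sqrt(2 * height / g)

# Horizontal distance: constant horizontal velocity times the fall time
distance = v_horizontal * t

print(f"fall time ≈ {t:.2f} s")        # ≈ 5.21 s
print(f"range     ≈ {distance:.1f} m") # ≈ 114.6 m
```

Using the unrounded fall time (≈ 5.21 s) gives a range closer to 114.6 m; the 114.4 m figure above comes from rounding the time to 5.2 s first. Either way, the hammer lands roughly 114 m from the base of the cliff.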
