A rock is thrown at 53° above the horizontal toward a 20-meter-high cliff whose base is 15 meters away. Find the initial speed that makes the rock land at the edge of the cliff.

a) 10 m/s
b) 15 m/s
c) 20 m/s
d) 25 m/s

asked by Che Kofif (7.0k points)

1 Answer


Final answer:

To determine the initial speed needed for a rock thrown at a given angle to land at the edge of a cliff, separate the motion into horizontal and vertical components and use the kinematic equations to solve for the initial speed. Without carrying out the calculation explicitly, the correct choice cannot be definitively selected from the given options.

Step-by-step explanation:

To find the initial speed that makes the rock land at the edge of the 20-meter cliff when it is thrown at a 53° angle from a point 15 meters from the base, we separate the motion into horizontal and vertical components and use the kinematic equations. Assuming negligible air resistance, the rock's horizontal velocity is constant, while its vertical motion is uniformly accelerated by gravity (g = 9.8 m/s²).

First, we find the time it takes the rock to cover the horizontal distance of 15 meters. The horizontal component of the initial velocity is Vx = V₀ × cos(θ), where V₀ is the initial speed and θ is the 53° launch angle. Since the horizontal distance Dx is covered at constant speed, Dx = Vx × t, so the flight time is t = Dx / (V₀ × cos(θ)).
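As a small illustrative sketch of this timing step (the 20 m/s used below is just a trial value taken from the answer choices, not a claimed result):

```python
import math

theta = math.radians(53)   # launch angle above the horizontal
dx = 15.0                  # horizontal distance to the cliff base, in meters

def flight_time(v0):
    """Time to cover the horizontal distance: t = Dx / (V0 * cos(theta))."""
    return dx / (v0 * math.cos(theta))

# Trial speed only, chosen for illustration
print(round(flight_time(20.0), 2))   # about 1.25 s
```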

For the vertical motion, the rock must rise 20 meters by the time it reaches the cliff edge. The vertical component of the initial velocity is Vy = V₀ × sin(θ), and the vertical displacement follows Dy = Vy × t - (½ × g × t²), where Dy is the change in height (20 meters here) and g is the acceleration due to gravity. Substituting t = Dx / (V₀ × cos(θ)) into this equation gives Dy = Dx × tan(θ) - (g × Dx²) / (2 × V₀² × cos²(θ)), which can then be solved for the initial speed V₀.
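A minimal sketch of that simultaneous solution, assuming g = 9.8 m/s² and negligible air resistance as stated above: the snippet substitutes the flight time from the horizontal equation into the vertical equation and reports the rock's height when it has covered the 15 m, so each listed speed can be compared against the required 20 m.

```python
import math

g = 9.8                    # gravitational acceleration, m/s^2
theta = math.radians(53)   # launch angle above the horizontal
dx = 15.0                  # horizontal distance to the cliff base, m
dy = 20.0                  # cliff height, m

def height_at_cliff(v0):
    """Height of the rock when it has traveled dx horizontally:
    t = dx / (v0*cos(theta)), then y = v0*sin(theta)*t - 0.5*g*t**2."""
    t = dx / (v0 * math.cos(theta))
    return v0 * math.sin(theta) * t - 0.5 * g * t**2

# Compare each answer option against the required cliff height
for v0 in (10, 15, 20, 25):
    print(f"v0 = {v0:2d} m/s -> height at x = 15 m: {height_at_cliff(v0):6.2f} m (need {dy:.0f} m)")
```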

answered by Erald Karakashi (7.1k points)