4 votes
A ball is thrown horizontally and travels 100.0 m. The ball is released 1.90 m from the ground. What speed must the ball have been traveling at? Ignore friction.

a. 10.0 m/s

b. 15.0 m/s

c. 20.0 m/s

d. 25.0 m/s

asked by User Mpark (7.3k points)

1 Answer

5 votes

Final answer:

Using the equations for horizontal and vertical projectile motion, the ball must have been traveling at approximately 161 m/s. Note that this does not match any of the listed answer choices, which suggests an inconsistency in the problem's given numbers.

Step-by-step explanation:

To determine the speed at which the ball must have been traveling, we can use the equation for horizontal motion: distance = velocity × time. Since the ball is thrown horizontally, its initial vertical velocity is zero, and because air resistance is neglected, its horizontal velocity stays constant throughout the flight. Therefore, the time it takes to travel the horizontal distance of 100.0 m is the same as the time it takes for the ball to fall 1.90 m under gravity. Using the equation for vertical motion, height = (1/2) × gravity × time², we can solve for the time: time = sqrt((2 × height) / gravity) = sqrt((2 × 1.90 m) / 9.81 m/s²) ≈ 0.622 s.

To calculate the initial horizontal velocity, we can use the equation for horizontal motion: velocity = distance / time. Plugging in the given values, we get velocity = 100.0 m / 0.622 s ≈ 160.7 m/s.

Therefore, the ball must have been traveling at approximately 161 m/s in order to travel 100.0 m horizontally while falling 1.90 m. Since none of the choices (a–d) is close to this value, the given distances are likely inconsistent with the answer options.
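The two-step calculation above (fall time from the drop height, then horizontal speed from the range) can be verified numerically; here is a minimal sketch in plain Python using only the standard library:

```python
import math

g = 9.81   # gravitational acceleration, m/s^2
h = 1.90   # release height above the ground, m
d = 100.0  # horizontal distance traveled, m

# Fall time from h = (1/2) * g * t^2  =>  t = sqrt(2h / g)
t = math.sqrt(2 * h / g)

# Constant horizontal speed needed to cover d in that time
v = d / t

print(f"fall time    t = {t:.3f} s")    # ~0.622 s
print(f"launch speed v = {v:.1f} m/s")  # ~160.7 m/s
```

Running this confirms that the required speed (~160.7 m/s) is far larger than any of the 10–25 m/s answer choices.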

answered by User John Fonseka (7.6k points)