A ball is thrown horizontally and travels 100.0 m before hitting the ground. The ball is released 1.90 m above the ground. At what speed must the ball have been thrown? Ignore air resistance.

asked by Matteo NNZ

1 Answer


Final answer:

To find the initial speed of a horizontally thrown ball that travels a given distance before hitting the ground, apply the equations of projectile motion: find the time of flight from the release height, then divide the horizontal distance by that time. Here the ball must have been thrown at about 160.7 m/s.

Step-by-step explanation:

The student's question involves calculating the initial speed of a ball thrown horizontally from a height, given the horizontal distance traveled before it hits the ground. This can be solved using the principles of projectile motion.

First, determine the time of flight from the vertical motion under gravity: h = (1/2)gt^2, where h is the release height (1.90 m) and g is the acceleration due to gravity (9.81 m/s^2). Solving for t gives t = √(2h/g) = √(2 × 1.90 / 9.81) ≈ 0.622 s. Because the horizontal motion proceeds at constant velocity, the initial speed is the horizontal distance divided by the time of flight: v = 100.0 m / 0.622 s ≈ 160.7 m/s.
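
As a sanity check, here is a minimal Python sketch of the same computation (the variable names are illustrative, not part of the original problem):

    import math

    # Given values from the problem statement
    h = 1.90   # release height above the ground, m
    d = 100.0  # horizontal distance traveled, m
    g = 9.81   # acceleration due to gravity, m/s^2

    # Vertical free fall sets the time of flight: h = (1/2) * g * t^2
    t = math.sqrt(2 * h / g)

    # Horizontal motion is at constant velocity, so speed = distance / time
    v = d / t

    print(f"time of flight: {t:.3f} s")   # ~0.622 s
    print(f"initial speed: {v:.1f} m/s")  # ~160.7 m/s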

answered by Hieu Rocker