2 votes
A baseball is thrown at an angle of 20° relative to the ground at a speed of 25 m/s. If the ball was caught 50 m from the thrower, how long was it in the air?

2.1 s
0.5 s
10 s
5 s

asked by User Ghufranne (8.0k points)

1 Answer

1 vote

Answer:

2.1 s

Step-by-step explanation:

The motion of the ball is projectile motion. We know that the horizontal range of the ball is

d = 50 \text{ m}

And that the initial speed of the ball is

u = 25 \text{ m/s}

at an angle of

\theta = 20^\circ

So, the horizontal speed of the ball (which is constant during the entire motion) is

u_x = u \cos\theta = 25 \cdot \cos 20^\circ = 23.5 \text{ m/s}

And since the horizontal range is 50 m, the time taken for the ball to cover this distance was

t = \frac{d}{u_x} = \frac{50}{23.5} = 2.1 \text{ s}

which is the time the ball spent in the air.
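
If you want to check the arithmetic yourself, here is a minimal Python sketch of the same two steps; the variable names simply mirror the symbols used above and are not part of the original answer:

```python
import math

# Given values from the problem statement
u = 25.0      # initial speed, m/s
theta = 20.0  # launch angle, degrees
d = 50.0      # horizontal distance to the catch, m

# Horizontal velocity component, constant during the flight
u_x = u * math.cos(math.radians(theta))  # about 23.5 m/s

# Time of flight from the horizontal motion: d = u_x * t
t = d / u_x  # about 2.1 s

print(f"u_x = {u_x:.1f} m/s")
print(f"t = {t:.1f} s")
```

Running it prints u_x = 23.5 m/s and t = 2.1 s, matching the result above.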

answered by User Pavel Levin (7.8k points)