Assuming the ball's initial velocity was 51° above the horizontal and ignoring air resistance, what did the initial speed of the ball need to be to produce such a home run if the ball was hit at a point 0.9 m (3.0 ft) above ground level? Assume that the ground was perfectly flat. Express your answer using two significant figures.

1 Answer


The horizontal distance of the home run is 400 ft = 122 m.

The ball is hit 3.0 ft = 0.9 m above the ground, so on flat ground it lands 0.9 m below the launch point.

The launch angle is θ = 51°.

Taking the launch point as the origin, the parametric equations of projectile motion are


x = v\cos\theta\, t

y = v\sin\theta\, t - \frac{1}{2}gt^2

Solving these two equations together (eliminating t) gives the trajectory equation


y = x\tan\theta - \frac{gx^2}{2v^2\cos^2\theta}
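
For completeness, the elimination of t uses the same symbols as above: from the first equation t = \frac{x}{v\cos\theta}, and substituting into the second,

y = v\sin\theta\cdot\frac{x}{v\cos\theta} - \frac{1}{2}g\left(\frac{x}{v\cos\theta}\right)^2 = x\tan\theta - \frac{gx^2}{2v^2\cos^2\theta}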

Now plug in the data, with y = -0.9 m since the ball lands 0.9 m below the point where it was hit:


-0.9 = 122\tan 51^\circ - \frac{9.8 \times 122^2}{2v^2\cos^2 51^\circ}


-0.9 = 150.66 - \frac{184150.2}{v^2}


\frac{184150.2}{v^2} = 151.56 \quad\Rightarrow\quad v^2 = 1215


v = 34.9 m/s

So the ball had to be hit with an initial speed of about 35 m/s (two significant figures) from a point 0.9 m above the ground.
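
As a quick numerical check, here is a minimal Python sketch (the variable names and the use of the standard math module are my own, not part of the original solution) that solves the trajectory equation for v and verifies the landing point:

import math

g = 9.8                      # gravitational acceleration, m/s^2
x = 122.0                    # horizontal distance of the home run, m (400 ft)
y = -0.9                     # landing height relative to the launch point, m
theta = math.radians(51.0)   # launch angle

# Solve y = x*tan(theta) - g*x**2 / (2*v**2*cos(theta)**2) for v
v = math.sqrt(g * x**2 / (2 * math.cos(theta)**2 * (x * math.tan(theta) - y)))
print(round(v, 1))           # about 34.9, i.e. 35 m/s to two significant figures

# Cross-check with the parametric equations: time of flight and landing height
t = x / (v * math.cos(theta))
print(round(v * math.sin(theta) * t - 0.5 * g * t**2, 2))   # about -0.9 m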
