Final answer:
The ball was in the air for approximately 4.95 seconds, and it was moving at a speed of approximately 35.53 m/s when it left the bat.
Step-by-step explanation:
The problem gives only the launch angle (43°) and the horizontal distance (128.5 m), so both the time of flight t and the launch speed v0 are unknown and the two motion equations must be solved together.
Vertical motion: the ball lands at the same height it was hit, so the vertical displacement is zero: 0 = v0*sin(43°)*t - 0.5*9.8*t^2. Solving for the non-zero root gives the time of flight t = 2*v0*sin(43°)/9.8.
Horizontal motion: d = v_x*t, where the horizontal velocity is v_x = v0*cos(43°), so 128.5 = v0*cos(43°)*t.
Substituting the expression for t into the horizontal equation gives 128.5 = v0^2*sin(86°)/9.8 (using the identity 2*sin(θ)*cos(θ) = sin(2θ)). Solving for the launch speed: v0 = sqrt(9.8*128.5/sin(86°)) ≈ 35.53 m/s. Then t = 2*35.53*sin(43°)/9.8 ≈ 4.95 seconds. Check: v0*cos(43°)*t recovers the given 128.5 m horizontal distance.
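The two equations can be checked numerically. A short Python sketch (assuming the values from the problem: g = 9.8 m/s^2, a 43° launch angle, and a 128.5 m horizontal distance, with the ball landing at its launch height):

```python
import math

g = 9.8                     # gravitational acceleration, m/s^2
theta = math.radians(43)    # launch angle (given)
R = 128.5                   # horizontal distance to landing point, m (given)

# Range formula R = v0^2 * sin(2*theta) / g, solved for launch speed v0.
v0 = math.sqrt(g * R / math.sin(2 * theta))

# Time of flight for a projectile that lands at its launch height.
t = 2 * v0 * math.sin(theta) / g

print(f"launch speed v0 = {v0:.2f} m/s")   # ≈ 35.53 m/s
print(f"time of flight t = {t:.2f} s")     # ≈ 4.95 s
print(f"check range = {v0 * math.cos(theta) * t:.1f} m")  # ≈ 128.5 m
```

Note that `math.sin` expects radians, which is why the 43° angle is converted with `math.radians` first.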