If a baseball is batted at an angle of 35° to the ground, the distance the ball travels can be estimated using the equation d = 0.0034s² + 0.004s - 0.3, where s is the bat speed, in kilometres per hour, and d is the distance flown, in metres. At what speed does the batter need to hit the ball in order to have a home run where the ball flies 125 m? Round to the nearest tenth.


1 Answer


The batter needs to hit the ball at a speed of approximately 191.4 km/h.

Explanation:

We have the distance flown, in metres:

d = 0.0034s² + 0.004s - 0.3

where s is the bat speed, in kilometres per hour.

We need to find s when d = 125 m.

Substituting d = 125 m:

d = 0.0034s² + 0.004s - 0.3

125 = 0.0034s² + 0.004s - 0.3

0.0034s² + 0.004s - 125.3 = 0

Solving with the quadratic formula, s = (-b ± √(b² - 4ac)) / (2a), where a = 0.0034, b = 0.004 and c = -125.3:

s = (-0.004 ± √(0.004² + 4 × 0.0034 × 125.3)) / (2 × 0.0034)

s ≈ 191.38 km/h or s ≈ -192.56 km/h (not possible, since speed cannot be negative)

Rounded to the nearest tenth, the batter needs to hit the ball at a speed of 191.4 km/h.
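
For a quick numerical check, here is a minimal Python sketch (not part of the original answer) that applies the same quadratic formula to the rearranged equation:

import math

# Coefficients of 0.0034s² + 0.004s - 125.3 = 0, i.e. the distance
# equation d = 0.0034s² + 0.004s - 0.3 with d = 125 m moved to one side
a, b, c = 0.0034, 0.004, -125.3

# Quadratic formula: s = (-b ± √(b² - 4ac)) / (2a)
disc = math.sqrt(b * b - 4 * a * c)
roots = ((-b + disc) / (2 * a), (-b - disc) / (2 * a))

# Keep the positive root; a negative bat speed is not physical
speed = max(roots)
print(f"Bat speed: {speed:.1f} km/h")  # prints: Bat speed: 191.4 km/h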
