46,129 views
28 votes
A pitcher throws a baseball horizontally from the mound to home plate. The ball falls 0.857 m (2.81 ft) by the time it reaches home plate 18.3 m (60 ft) away. How fast was the pitcher's pitch?

asked by Rob Hardy (2.5k points)

1 Answer

19 votes

Answer:

Baseball speeds are commonly expressed in English units, so we'll use those.

(The rubber is 60' 6" from home plate, but the ball is probably released at about 60 ft from the plate.)

The time of fall equals the time to reach the plate, t = S / v.

From the free-fall relation H = 1/2 g t^2, solving for t gives

t = (2 H / g)^(1/2), and we'll use g = 32.2 ft/sec^2

t = (2 * 2.81 / 32.2)^(1/2) = 0.418 sec

v = 60 ft / 0.418 sec ≈ 144 ft/sec

Since 60 mph = 88 ft/sec,

v = 144 / 88 * 60 ≈ 98 mph
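The arithmetic above can be checked with a short Python sketch (the variable names are my own, chosen for this illustration):

```python
import math

g = 32.2          # gravitational acceleration, ft/sec^2
drop = 2.81       # vertical fall on the way to the plate, ft
distance = 60.0   # assumed release-to-plate distance, ft
MPH_PER_FPS = 60.0 / 88.0  # conversion factor: 60 mph = 88 ft/sec

# Time of fall: H = 1/2 g t^2  =>  t = sqrt(2 H / g)
t = math.sqrt(2 * drop / g)

# Horizontal speed: v = S / t, then convert to mph
v_fps = distance / t
v_mph = v_fps * MPH_PER_FPS

print(f"fall time = {t:.3f} sec, speed = {v_fps:.0f} ft/sec = {v_mph:.0f} mph")
# → fall time = 0.418 sec, speed = 144 ft/sec = 98 mph
```

Note the fall time drops out of the horizontal distance entirely; gravity sets the clock, and the 60 ft distance then fixes the speed.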

answered by Lamrin (3.0k points)