Pitcher's mounds are raised to compensate for the vertical drop of the ball as it travels a horizontal distance of 18 m to the catcher. (a) If a pitch is thrown horizontally with an initial speed of 32 m/s, how far does it drop by the time it reaches the catcher? (b) If the speed of the pitch is increased, does the drop distance increase, decrease, or stay the same? Explain. (c) If this baseball game were to be played on the Moon, would the drop distance increase, decrease, or stay the same? Explain.

by Mandar Pathak (2.3k points)

1 Answer


For part (a):

Horizontal motion:

v = horizontal speed = 32 m/s

d = horizontal distance = 18 m

We use the following formula to find the flight time:

v = d/t

Solving for the time:

t = d/v

Substituting the values:

t = 18/32 ≈ 0.56 s

For the drop distance, we use the kinematic equation

d = v_i·t + (1/2)·g·t²

where v_i is the initial vertical velocity, g is the acceleration due to gravity, and d is the drop distance.

v_i = 0 m/s (the pitch is thrown horizontally, so there is no initial vertical velocity)

g = 9.8 m/s²

t = 0.56 s


d = 0(0.56) + (1/2)(9.8)(0.56)² ≈ 1.54 m
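As a quick sanity check, here is a minimal Python sketch of the same calculation (the variable names are my own, chosen for clarity):

```python
# Minimal sketch of the part (a) calculation, assuming standard
# projectile kinematics with no air resistance.

v = 32.0    # horizontal speed of the pitch (m/s)
d_h = 18.0  # horizontal distance to the catcher (m)
g = 9.8     # acceleration due to gravity on Earth (m/s^2)

t = d_h / v            # flight time: t = d/v
drop = 0.5 * g * t**2  # vertical drop: d = (1/2) g t^2, since v_i = 0

print(f"flight time: {t:.2f} s")       # 0.56 s
print(f"drop distance: {drop:.2f} m")  # 1.55 m
```

Keeping full precision in t gives a drop of about 1.55 m; the 1.54 m above comes from rounding t to 0.56 s before squaring.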

For part (b):

The drop distance decreases, because a faster pitch takes less time to reach the catcher, and less time in the air means less vertical drop.
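To make this explicit, substitute t = d/v into the drop formula: drop = (1/2)·g·(d/v)² = g·d²/(2v²), so the drop falls off as 1/v². For example (a speed chosen purely for illustration), at 40 m/s the drop would be (9.8)(18)²/(2·40²) ≈ 0.99 m, down from 1.54 m at 32 m/s.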

For part (c):

The drop distance would decrease, because gravity is weaker on the Moon and therefore the ball would not fall as far during the same flight time.
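For a rough number: gravity on the Moon is about 1.62 m/s², roughly one-sixth of Earth's 9.8 m/s². With the same flight time t ≈ 0.56 s, the drop would be (1/2)(1.62)(0.56)² ≈ 0.25 m instead of about 1.5 m.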

by Szymanowski (3.4k points)