A pitcher throws a baseball at 45.15 m/s. If a pitch were thrown horizontally with this velocity, how far would the baseball fall vertically by the time it reached home plate, 17.44 m away?

A) 11.2 m
B) 13.5 m
C) 17.4 m
D) 22.4 m

asked by User Bick

1 Answer


Final answer:

The baseball would fall vertically by approximately 0.73 meters by the time it reaches home plate.

Step-by-step explanation:

To determine how far the baseball falls vertically by the time it reaches home plate, we use the distance formula for uniformly accelerated motion from rest:

distance = (1/2) × acceleration × time²

Here the acceleration is due to gravity, approximately 9.8 m/s². Because the horizontal and vertical motions are independent, the time of flight comes from the horizontal motion alone: divide the pitcher-to-plate distance by the horizontal velocity. Once we have the time, we plug it into the equation to find the vertical drop:

distance = (1/2) × 9.8 × time²

Using the given horizontal velocity of 45.15 m/s and the distance of 17.44 m, we can calculate the time it takes for the baseball to reach home plate:

  • time = distance / velocity = 17.44 / 45.15 ≈ 0.386 s

Plugging this value into the equation gives the vertical distance:

  • distance = (1/2) × 9.8 × (0.386)² ≈ 0.73 m

So the baseball falls only about 0.73 m vertically by the time it reaches home plate. Note that none of the listed answer choices matches this result; the choices appear to correspond to a different set of numbers than those given in the question.
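The two steps above (time of flight from the horizontal motion, then vertical drop from free fall) can be checked with a short script; this is a minimal sketch, and the variable names are mine:

```python
# Horizontal projectile motion: a pitch thrown flat at v m/s
# drops under gravity alone while it covers distance x.
g = 9.8      # m/s^2, acceleration due to gravity
v = 45.15    # m/s, horizontal speed of the pitch (from the question)
x = 17.44    # m, horizontal distance to home plate (from the question)

t = x / v                # time of flight; horizontal motion is unaccelerated
drop = 0.5 * g * t ** 2  # vertical fall during that time

print(f"t = {t:.3f} s, drop = {drop:.2f} m")
# prints "t = 0.386 s, drop = 0.73 m"
```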

answered by User Bhavya Arora