3 votes
One of the fastest pitches ever thrown in Major League Baseball was by Aroldis Chapman and had a velocity of 105.1 miles/hour. How many seconds did it take this pitch to travel the 60 feet and 6 inches from the pitcher's mound to home plate? (1 mile = 5280 feet.)

2 Answers

3 votes

Final answer:

The pitch took approximately 0.392 seconds to travel from the pitcher's mound to home plate.

Step-by-step explanation:

To find the time it takes for a pitch to travel from the pitcher's mound to home plate, we need to express the distance and the velocity in consistent units. The distance is 60 feet and 6 inches, which is equivalent to 60 + (6 / 12) = 60.5 feet. Since 1 mile is equal to 5280 feet, we can convert the distance to miles by dividing by 5280: 60.5 / 5280 ≈ 0.011458 miles.

Now we can use the formula time = distance / velocity, with the distance in miles and the velocity in miles per second.

To convert miles/hour to miles/second, we divide by 3600 (the number of seconds in an hour): 105.1 / 3600 ≈ 0.029194 miles/second.

Dividing the distance by the velocity gives 0.011458 miles / 0.029194 miles/second ≈ 0.3925 seconds.

Therefore, the pitch took approximately 0.392 seconds to travel from the pitcher's mound to home plate.
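As a quick sanity check on the arithmetic, here is a minimal Python sketch of the same steps (the variable names are just for illustration):

```python
# Check of the first approach: convert the distance to miles and
# the speed to miles per second, then divide.
distance_ft = 60 + 6 / 12            # 60 feet 6 inches = 60.5 ft
distance_mi = distance_ft / 5280     # ≈ 0.011458 miles
speed_mph = 105.1
speed_mi_per_s = speed_mph / 3600    # ≈ 0.029194 miles/second

time_s = distance_mi / speed_mi_per_s
print(round(time_s, 4))              # prints 0.3925
```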

by User Mahmudul Hasan (7.5k points)
1 vote

Answer:

V = 105.1 miles/hr * (5280 ft/mile) / (3600 sec/hr)

V ≈ 154.15 ft/sec

t = S / V = 60.5 ft / 154.15 ft/sec ≈ 0.392 sec  (60 ft 6 in = 60.5 ft)
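For comparison, a minimal Python sketch of this approach, which converts the speed to ft/sec first (variable names are illustrative):

```python
# Second approach: convert the speed to ft/sec and keep the distance in feet.
speed_ft_per_s = 105.1 * 5280 / 3600   # ≈ 154.15 ft/sec
distance_ft = 60.5                     # 60 feet 6 inches
time_s = distance_ft / speed_ft_per_s
print(round(time_s, 3))                # prints 0.392
```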

by User Nazar Vynnytskyi (8.1k points)