Final answer:
The pitch took approximately 0.392 seconds to travel from the pitcher's mound to home plate.
Step-by-step explanation:
To find the time the pitch takes to travel from the pitcher's mound to home plate, we first express the distance in the same units as the velocity. The distance is 60 feet 6 inches, which is 60 + (6 / 12) = 60.5 feet. Since 1 mile equals 5280 feet, the distance in miles is 60.5 / 5280 ≈ 0.011458 miles.
Next, we apply the formula time = distance / velocity. The velocity is given in miles per hour, but we want the time in seconds, so we convert it to miles per second by dividing by 3600 (the number of seconds in an hour): 105.1 / 3600 ≈ 0.029194 miles/second.
Dividing the distance by the velocity gives time = 0.011458 miles / 0.029194 miles/second ≈ 0.392 seconds. (Keeping a few extra decimal places in the intermediate values avoids rounding error in the final answer.)
Therefore, the pitch took approximately 0.392 seconds to travel from the pitcher's mound to home plate.
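As a quick check, here is a minimal Python sketch of the same calculation. The language, constant names, and print format are just for illustration; the 105.1 mph pitch speed and the 60 ft 6 in distance come from the problem itself.

```python
# Time for a pitch to reach home plate, using the same unit conversions as above.

FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

distance_feet = 60 + 6 / 12                       # 60 ft 6 in = 60.5 ft
distance_miles = distance_feet / FEET_PER_MILE    # ~0.011458 miles

speed_mph = 105.1                                 # pitch speed in miles per hour
speed_mps = speed_mph / SECONDS_PER_HOUR          # ~0.029194 miles per second

time_seconds = distance_miles / speed_mps
print(f"{time_seconds:.4f} seconds")              # prints ~0.3925 seconds
```

Running the sketch gives about 0.3925 seconds, which rounds to the 0.392 seconds found above.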