Final answer:
To find the time it takes a baseball thrown at 105 miles per hour to reach home plate, convert the speed to feet per second (154 ft/s) and divide the 60.5-foot distance by that speed, which gives approximately 0.39 seconds.
Step-by-step explanation:
To calculate how long the ball takes to travel from the pitcher to the batter, use the formula time = distance / speed. First, convert the pitch speed of 105 miles per hour to feet per second. There are 5,280 feet in a mile and 3,600 seconds in an hour, so 105 miles per hour equals (105 × 5280) / 3600 = 154 feet per second. Then divide the distance from the pitcher's mound to home plate (60 feet, 6 inches, or 60.5 feet) by that speed: 60.5 / 154 ≈ 0.39 seconds for the ball to reach home plate.
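For anyone who wants to double-check the arithmetic, here is a minimal Python sketch of the same unit conversion and time calculation (the variable names are just illustrative):

```python
# Unit-conversion constants
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_mph = 105      # pitch speed in miles per hour
distance_ft = 60.5   # pitcher's mound to home plate: 60 feet, 6 inches

# Convert mph to feet per second: (105 * 5280) / 3600 = 154 ft/s
speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR

# time = distance / speed
time_s = distance_ft / speed_fps

print(f"Speed: {speed_fps} ft/s")              # 154.0 ft/s
print(f"Time to home plate: {time_s:.3f} s")   # 0.393 s
```

Running this prints a speed of 154.0 ft/s and a travel time of about 0.393 seconds, matching the hand calculation above.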