1 vote
Many major-league baseball pitchers can throw the ball at 90 miles per hour. At that speed, how long does it take a pitch to travel from the pitcher’s mound to home plate, a distance of 60 feet 6 inches? Give your answer to the nearest hundredth of a second. There are 5280 feet in a mile and 12 inches in a foot.

asked by HenrikP (8.6k points)

2 Answers

7 votes

To solve this problem, we must first convert the distance into miles:

distance = 60 ft × (1 mile / 5280 ft) + 6 in × (1 ft / 12 in) × (1 mile / 5280 ft)

distance ≈ 0.0114583 miles

To calculate the time:

time = distance / speed

time = 0.0114583 miles ÷ (90 miles / 3600 seconds)

time ≈ 0.458 seconds

time ≈ 0.46 seconds (to the nearest hundredth)
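If you want to check the arithmetic, here is a minimal Python sketch of the same conversion (the variable names are illustrative, not part of the answer above):

```python
FEET_PER_MILE = 5280
INCHES_PER_FOOT = 12
SECONDS_PER_HOUR = 3600

# Distance from the pitcher's mound to home plate: 60 ft 6 in, in miles.
distance_miles = 60 / FEET_PER_MILE + (6 / INCHES_PER_FOOT) / FEET_PER_MILE

# Speed: 90 miles per hour, expressed in miles per second.
speed_miles_per_second = 90 / SECONDS_PER_HOUR

# time = distance / speed, rounded to the nearest hundredth of a second.
time_seconds = distance_miles / speed_miles_per_second
print(round(time_seconds, 2))  # 0.46
```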

answered by Youssef Egla (8.2k points)
3 votes

Answer: It would take 0.46 seconds for the pitch to travel from the pitcher's mound to home plate.

Explanation:

We are given:

Speed at which baseball pitchers can throw the ball = 90 miles per hour

Distance covered = 60 feet 6 inches

We also know that

1 mile = 5280 feet

and 1 foot = 12 inches

6 inches = 6/12 = 1/2 = 0.5 feet

So, the total distance is 60 feet + 0.5 feet = 60.5 feet.

Now,

1 foot = 1/5280 miles

So, 60.5 feet = 60.5/5280 ≈ 0.011458 miles

So, the time taken by the pitch to travel from the pitcher's mound to home plate is given by


Time = Distance / Speed = 0.011458 / 90 ≈ 0.0001273 hours

0.0001273 hours × 3600 seconds/hour ≈ 0.458 seconds ≈ 0.46 seconds

Hence, it would take about 0.46 seconds for the pitch to travel from the pitcher's mound to home plate.
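The same calculation can be sketched in Python following this answer's route (feet to miles to hours, then to seconds); the names below are illustrative only:

```python
# Convert 60 ft 6 in to feet, then to miles; divide by speed in mph to get hours.
total_feet = 60 + 6 / 12              # 60.5 ft
distance_miles = total_feet / 5280    # ~0.011458 miles
time_hours = distance_miles / 90      # ~0.0001273 hours

# Convert hours to seconds and round to the nearest hundredth.
time_seconds = time_hours * 3600
print(f"{time_seconds:.2f} seconds")  # 0.46 seconds
```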

answered by Malexmave (8.1k points)