A pitcher throws a baseball 90.0 miles per hour. Convert this speed into feet per second. (1 mile = 5,280 feet)

asked by Monie

1 Answer


Hello User,

Answer:

1 mile = 5,280 ft, so 90 miles = 90 × 5,280 = 475,200 ft.

There are 60 seconds in 1 minute and 60 minutes in 1 hour, so 1 hour = 60 × 60 = 3,600 seconds.

475,200 ÷ 3,600 = 132, so 90 miles per hour = 132 feet per second.
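The same conversion can be sketched in Python (the function name `mph_to_fps` is my own, not from the problem):

```python
# Convert a speed in miles per hour to feet per second.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 60 * 60  # 3,600 seconds in one hour

def mph_to_fps(mph):
    """Return the speed in feet per second."""
    return mph * FEET_PER_MILE / SECONDS_PER_HOUR

print(mph_to_fps(90.0))  # 132.0
```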

(In versions of this problem that compare two pitchers, the Robins pitcher throwing at 132 ft/sec is faster than the Bluebirds pitcher throwing at 121 ft/sec.)

answered by Rawwar