What is the average speed in miles per hour if you ran 100 yards in 9.0 s?


1 Answer


Given:

Distance covered, d = 100 yards

Time, t = 9.0 seconds

Let's find the average speed in miles per hour.

To find the average speed, apply the formula:


\text{speed}=\frac{\text{distance}}{\text{time}}

Since we need the average speed in miles per hour, we first convert the distance from yards to miles and the time from seconds to hours.

We have:

• Distance.

Where:

1 mile = 1760 yards


100\text{ yards}=\frac{100}{1760}\text{ miles}\approx 0.0568\text{ miles}

• Time.

Where:

1 hour = 3600 seconds


9\text{ seconds}=\frac{9}{3600}\text{ hours}=0.0025\text{ hours}

The distance in miles is 0.0568 miles.

The time in hours is 0.0025 hours.

Hence, to find the average speed we have:


\text{speed}=\frac{\text{distance}}{\text{time}}=\frac{0.0568\text{ miles}}{0.0025\text{ hours}}=22.72\text{ mph}

Therefore, the average speed is 22.72 miles per hour.

ANSWER:

22.72 mph
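
As a quick check, here is a minimal Python sketch of the same conversion; the helper name `average_speed_mph` is just for illustration.

```python
YARDS_PER_MILE = 1760
SECONDS_PER_HOUR = 3600

def average_speed_mph(distance_yards: float, time_seconds: float) -> float:
    """Average speed in miles per hour, from a distance in yards and a time in seconds."""
    miles = distance_yards / YARDS_PER_MILE    # 100 / 1760 ≈ 0.0568 miles
    hours = time_seconds / SECONDS_PER_HOUR    # 9 / 3600 = 0.0025 hours
    return miles / hours

print(round(average_speed_mph(100, 9.0), 2))
# Prints 22.73; the 22.72 above comes from rounding the distance to 0.0568 miles
# before dividing.
```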
