Final answer:
To find your rate of change, you divide the distance run (10 miles) by the time in hours (80 minutes = 4/3 ≈ 1.333 hours), giving an average speed of 7.5 miles per hour.
Step-by-step explanation:
Calculating Rate of Change
If you can run 10 miles in one hour and 20 minutes, your speed, or rate of change in miles per hour, is distance divided by time. One hour and 20 minutes is 80 minutes, which converts to hours by dividing by 60: 80 ÷ 60 = 4/3 ≈ 1.333 hours. The calculation is then 10 miles ÷ 1.333 hours, giving a speed of 7.5 miles per hour.
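This arithmetic can be checked with a short script (the variable names are just illustrative):

```python
# Speed = distance / time, with time converted from minutes to hours.
distance_miles = 10
time_minutes = 80            # one hour and 20 minutes

time_hours = time_minutes / 60           # 80 / 60 = 1.333... hours
speed_mph = distance_miles / time_hours  # 10 / (4/3) = 7.5 mph

print(round(speed_mph, 1))  # → 7.5
```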
Using this rate of change, we can compare other examples. For instance, if 40 percent of runners run at speeds of 7.5 miles per hour or less, then you are right at the threshold of this group. A marathon runner averaging 9.5 miles per hour would complete a 26.22-mile marathon in 26.22 ÷ 9.5 = 2.76 hours, or about 2 hours and 46 minutes. If a runner covers distances of 9.5, 8.89, and 2.333 miles in varying directions, the distances sum to 20.723 miles; reported to the tenths place to match the least precise measurement (9.5), the total distance is 20.7 miles.
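The comparison examples can be verified the same way, using the distances and speeds given above:

```python
# Marathon finish time at an average speed of 9.5 mph.
marathon_miles = 26.22
speed_mph = 9.5
hours = marathon_miles / speed_mph  # 2.76 hours
minutes = hours * 60                # 165.6 minutes, about 2 h 46 min

# Total of the three leg distances, rounded to the tenths place
# to match the least precise measurement (9.5).
legs = [9.5, 8.89, 2.333]
total = sum(legs)                   # 20.723
print(round(total, 1))              # → 20.7
```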