Final answer:
The average time to run a mile varies by age and fitness level; for example, school classes at different grade levels have different mean mile times. Professional runners have their own statistics, such as a marathon runner averaging 9.5 miles per hour. Understanding statistical measures such as the mean, standard deviation, and percentiles is key to analyzing running performance.
Step-by-step explanation:
The average time to run a mile can differ based on factors such as age, sex, and fitness level, so it is important to account for them when discussing average times. For instance, suppose an elementary school class has a mean mile time of 11 minutes with a standard deviation of 3 minutes, a junior high class a mean of 9 minutes with a standard deviation of 2 minutes, and a high school class a mean of 7 minutes with a standard deviation of 4 minutes. These statistics suggest that average times tend to decrease as students get older and, presumably, more physically developed.
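As a quick sketch, the class statistics above can be compared using the coefficient of variation (standard deviation divided by the mean), which expresses spread relative to the typical time. The numbers are simply the illustrative figures from the explanation:

```python
def coefficient_of_variation(mean, sd):
    """Spread relative to the mean; lets classes with different means be compared fairly."""
    return sd / mean

# Illustrative (mean, standard deviation) mile times in minutes, per class.
classes = {
    "elementary": (11, 3),
    "junior high": (9, 2),
    "high school": (7, 4),
}

for name, (mean, sd) in classes.items():
    print(f"{name}: CV = {coefficient_of_variation(mean, sd):.2f}")
```

On these numbers the high school class, despite the fastest mean, has the largest relative spread (4/7 ≈ 0.57), so its times are the least consistent.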
When looking at marathon runners, we see different statistics. For example, a runner averaging 9.5 miles per hour would finish a 26.22-mile marathon in roughly 2 hours and 46 minutes. We can also use the standard deviation to understand variability in race times: a Bay to Breakers runner whose race times vary by at most 3 minutes shows low variability in her performance.
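The finish-time arithmetic above is just distance divided by speed; a small sketch makes the conversion to hours and minutes explicit:

```python
def finish_time(distance_miles, speed_mph):
    """Return (hours, minutes) to cover a distance at a steady average speed."""
    total_hours = distance_miles / speed_mph
    hours = int(total_hours)
    minutes = round((total_hours - hours) * 60)
    return hours, minutes

# Marathon distance at 9.5 mph: 26.22 / 9.5 = 2.76 hours.
print(finish_time(26.22, 9.5))  # → (2, 46)
```

The same function works for any race, e.g. `finish_time(7.46, 9.5)` for the Bay to Breakers course.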
Understanding statistics in the context of running also lets one conduct hypothesis tests, determine percentiles, and calculate medians of running times. Analyzing elite performances, such as Usain Bolt's world record of 9.58 seconds in the 100-meter dash (an average speed of about 23.4 miles per hour), shows the speed at which elite athletes operate.
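The 23.4 mph figure follows directly from the record: 100 meters in 9.58 seconds gives the average speed in meters per second, which a unit conversion turns into miles per hour:

```python
def meters_per_second_to_mph(mps):
    # 1 mile = 1609.344 meters; 3600 seconds per hour.
    return mps * 3600 / 1609.344

bolt_speed = 100 / 9.58  # average m/s over the 100-meter dash
print(round(meters_per_second_to_mph(bolt_speed), 1))  # → 23.4
```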