Suppose a random sample of 32 National Football League players had an average weight of 252.5 pounds. Assume that the standard deviation for the weight of all players in the league is 31.1 pounds. Determine the 95% confidence interval for this sample.

1 Answer


Final answer:

The 95% confidence interval for the average weight of NFL players based on this sample is approximately (241.724, 263.276) pounds. The California quarterback was 1.53 standard deviations below the mean, making him lighter relative to his team's distribution than the Texas player, who was 0.70 standard deviations below his team's mean.

Step-by-step explanation:

Constructing a 95% Confidence Interval

To determine the 95% confidence interval for the average weight of NFL players using a sample mean of 252.5 pounds, a sample size of 32, and a population standard deviation of 31.1 pounds, we use the formula:

Margin of error = Z* × (σ/√n)

Where Z* is the critical z-value corresponding to a 95% confidence level (which is 1.96), σ is the population standard deviation, and n is the sample size.

So, the margin of error is:

1.96 × (31.1/√32) = 1.96 × 5.498

≈ 10.776

Thus, the 95% confidence interval is:

(252.5 − 10.776, 252.5 + 10.776) ≈ (241.724, 263.276)
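The calculation above can be checked with a short Python sketch (using the problem's values; the z-critical value 1.96 is hard-coded for 95% confidence):

```python
import math

# 95% z-interval for a mean with known population standard deviation
n = 32          # sample size
mean = 252.5    # sample mean (pounds)
sigma = 31.1    # population standard deviation (pounds)
z = 1.96        # critical z-value for 95% confidence

margin = z * (sigma / math.sqrt(n))
ci = (mean - margin, mean + margin)

print(round(margin, 3))                      # ≈ 10.776
print(tuple(round(x, 3) for x in ci))        # ≈ (241.724, 263.276)
```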

Standard Deviations for Players

The California quarterback weighed 205 pounds. To find how many standard deviations this is from the mean, we calculate:

(X - μ) / σ = (205 - 252.5) / 31.1

= -47.5 / 31.1

= -1.53

The quarterback was 1.53 standard deviations below the mean.

For the Texas player who weighed 209 pounds, with a team mean of 240.08 and standard deviation of 44.38:

(X - μ) / σ = (209 - 240.08) / 44.38

= -31.08 / 44.38

= -0.70

The Texas player was 0.70 standard deviations below the mean for his team.

Comparison of the Players

Comparing the z-scores, the California quarterback was lighter relative to his team: he was 1.53 standard deviations below the mean, while the Texas player was only 0.70 standard deviations below the mean of his team.
