It is believed that the IQs of people follow a normal distribution with a mean of 100 points and a variance of 256 points². If many random samples of 100 people are selected, (a) what would be the standard deviation of the sample mean IQ scores?


1 Answer


Final answer:

The standard deviation of the sample mean IQ scores is 1.6 points.

Step-by-step explanation:

The standard deviation of the sample mean (the standard error) is σ/√n, which can also be written as √(variance / sample size). Here the population variance is 256 points², so the population standard deviation is σ = √256 = 16 points, and the sample size is n = 100. Plugging in: √(256/100) = √2.56 = 1.6, or equivalently 16/√100 = 16/10 = 1.6. Therefore, the standard deviation of the sample mean IQ scores is 1.6 points.
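As a quick check, here is a minimal Python sketch of the same calculation (the function name standard_error is just illustrative, not from the question):

```python
import math

def standard_error(variance: float, n: int) -> float:
    """Standard deviation of the sample mean: sqrt(variance / n)."""
    return math.sqrt(variance / n)

# Population variance of 256 points^2, samples of n = 100 people
print(standard_error(256, 100))  # prints 1.6
```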
