A survey of 25 randomly selected customers found the ages shown (in years). The mean is 32.60 years and the standard deviation is 9.51 years.

a) What is the standard error of the mean?
b) How would the standard error change if the sample size had been 100 instead of 25? (Assume that the sample standard deviation didn't change.)

a) The standard error of the mean is ______. (Round to two decimal places as needed.)

b) Choose the correct option:
A. The standard error would increase. The new standard error would be ______ times the old.
B. The standard error would decrease. The new standard error would be the old standard error divided by ______.
C. The standard error would not change.


1 Answer


Final answer:

The standard error of the mean for a sample size of 25 with a standard deviation of 9.51 years is 1.90 years. If the sample size were increased to 100, the standard error would decrease to 0.95 years, i.e. the old standard error divided by 2 (choice B).

Step-by-step explanation:

The standard error of the mean (SEM) can be calculated using the sample standard deviation (s) and the sample size (n). The formula for the standard error of the mean is SEM = s / √n. For a sample size of 25 with a standard deviation of 9.51 years, the SEM is 9.51 / √25, which equals 1.90 when rounded to two decimal places (since √25 is 5).
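As a quick check, here is a minimal Python sketch of this calculation, using the values given in the question:

```python
import math

s = 9.51   # sample standard deviation, in years
n = 25     # sample size

# Standard error of the mean: SEM = s / sqrt(n)
sem = s / math.sqrt(n)
print(f"{sem:.2f}")  # 1.90
```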

If the sample size had been 100 instead, the SEM would decrease, because as the sample size increases, the standard error of the mean decreases. This is because SEM is inversely proportional to the square root of the sample size. Therefore, the new standard error would be the old standard error divided by √(100/25), which simplifies to √4 or 2. Hence, the new SEM would be 1.90 / 2, which equals 0.95.
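A similar sketch, again assuming the sample standard deviation stays at 9.51 years, confirms the factor of 2:

```python
import math

s = 9.51                 # sample standard deviation, in years
old_n, new_n = 25, 100   # old and new sample sizes

old_sem = s / math.sqrt(old_n)   # 9.51 / 5  = 1.902
new_sem = s / math.sqrt(new_n)   # 9.51 / 10 = 0.951

# The old SEM divided by the new SEM equals sqrt(new_n / old_n) = sqrt(4) = 2
print(f"{old_sem:.2f} / {new_sem:.2f} = {old_sem / new_sem:.0f}")  # 1.90 / 0.95 = 2
```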

