A sample of n = 100 is selected from a population with a mean of μ = 70. If the sample standard deviation is s = 20, how much difference is expected, on average, between the sample mean and the population mean? In other words, what is the estimated standard error for the sample mean?

1 Answer

Final answer:

The standard error of the mean for a sample size of 100 with a sample standard deviation of 20 is 2. This value reflects the expected average difference between the sample mean and the population mean.

Step-by-step explanation:

You're working with a concept called the standard error of the mean, which measures how much, on average, a sample mean is expected to differ from the population mean. The formula for the standard error of the mean (SEM) is the sample standard deviation (s) divided by the square root of the sample size (n).

Given that the sample standard deviation is s = 20 and the sample size is n = 100, the standard error of the mean can be calculated as:

SEM = s / √n

SEM = 20 / √100

SEM = 20 / 10

SEM = 2

Therefore, we can expect, on average, a difference of 2 points between the sample mean and the population mean. Note that the population mean μ = 70 is not needed for this calculation; the standard error depends only on s and n.
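The calculation above can be sketched in a few lines of Python (a minimal illustration; the function name `standard_error` is my own, not from the question):

```python
import math

def standard_error(s: float, n: int) -> float:
    """Estimated standard error of the mean: SEM = s / sqrt(n)."""
    return s / math.sqrt(n)

# Values from the problem: s = 20, n = 100
sem = standard_error(20, 100)
print(sem)  # 2.0
```

Notice that quadrupling the sample size would only halve the standard error, since n enters through a square root.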

Answered by Bernhard Pointner