Use the Central Limit Theorem to find the mean and standard error of the mean of the indicated sampling distribution.

The amounts of time employees of a telecommunications company have worked for the company are normally distributed with a mean of 5.20 years and a standard deviation of 2.10 years. Random samples of size 15 are drawn from the population and the mean of each sample is determined. Round the answers to the nearest hundredth.

1 Answer


Central Limit Theorem yields 5.20-year mean and 0.54-year standard error for employee tenure samples.

The Central Limit Theorem tells us that when the population is normally distributed, the sampling distribution of the sample mean is also normally distributed, regardless of the sample size.

Therefore, the mean of the sampling distribution of sample means is equal to the population mean, which is 5.20 years.

The standard error of the mean (SEM), representing the standard deviation of the sampling distribution, can be calculated using the formula:

SEM = population standard deviation / square root of sample size

In this case, SEM = 2.10 years / √15 ≈ 2.10 / 3.87 ≈ 0.54 years (rounded to the nearest hundredth).

Therefore, the mean of the sampling distribution is 5.20 years and the standard error of the mean is 0.54 years. Since the sampling distribution is normal, about 68% of sample means fall within one standard error of the mean, that is, within 5.20 ± 0.54 years, or approximately between 4.66 and 5.74 years.
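
To double-check the arithmetic, here is a minimal Python sketch of the same calculation (the variable names are my own, not part of the original problem):

```python
import math

# Values given in the problem
mu = 5.20      # population mean (years)
sigma = 2.10   # population standard deviation (years)
n = 15         # sample size

# Standard error of the mean: sigma / sqrt(n)
sem = sigma / math.sqrt(n)

print(f"Mean of sampling distribution: {mu:.2f} years")   # 5.20
print(f"Standard error of the mean:    {sem:.2f} years")  # 0.54
print(f"One-SE interval: {mu - sem:.2f} to {mu + sem:.2f} years")  # 4.66 to 5.74
```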
