A researcher has obtained a random sample of 100 patients. The average length of time it took these 100 patients to fill out a questionnaire was 3.0 minutes. If the standard deviation of the population of completion times is one minute, then the standard error of the mean equals

a. 0.001
b. 0.010
c. 0.100
d. 1.000

Final answer:

The standard error of the mean for the given sample of 100 patients, with a population standard deviation of one minute, is 0.1 minutes.

Step-by-step explanation:

The researcher wants to determine the standard error of the mean (SEM) for the time it took a sample of 100 patients to complete a questionnaire. The standard error of the mean equals the population standard deviation divided by the square root of the sample size: SEM = σ / √n, where σ is the population standard deviation and n is the sample size.

Here σ = 1 minute and n = 100, so SEM = 1 / √100 = 1 / 10 = 0.1 minutes. Therefore the standard error of the mean is 0.1 minutes, which matches option c.

The answer is c. 0.100
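
If you want to verify the arithmetic yourself, here is a minimal Python sketch of the same formula (the function name standard_error is just illustrative, not from any particular library):

```python
import math

def standard_error(sigma, n):
    """Standard error of the mean: population SD divided by the square root of the sample size."""
    return sigma / math.sqrt(n)

# Values from the question: sigma = 1 minute, n = 100 patients
print(standard_error(1.0, 100))  # prints 0.1
```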
