5 votes
A psychologist estimates the standard deviation of a driver's reaction time to be 0.05 seconds. How large a sample of measurements must be taken to derive a confidence interval for the mean with a margin of error of at most 0.01 seconds and a confidence level of 95%?

asked by HepaKKes (7.9k points)

1 Answer

3 votes

Answer: 97

Explanation:

The formula for the minimum sample size is

n=\left(\frac{z^*\cdot \sigma}{E}\right)^2

where \sigma is the prior (estimated) standard deviation, z^* is the critical value corresponding to the confidence level, and E is the margin of error.

Given: A psychologist estimates the standard deviation of a driver's reaction time to be 0.05 seconds,

i.e. \sigma = 0.05

E = 0.01

Critical value for a 95% confidence level: z^* = 1.96

Then, the required sample size is

n=\left(\frac{1.96\times 0.05}{0.01}\right)^2=(1.96\times 5)^2=9.8^2=96.04\approx 97

(rounded up to the next integer, since the sample size must be large enough to keep the margin of error within 0.01 seconds).

Hence, the required minimum sample size = 97.
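As a quick check, here is a minimal Python sketch of the same calculation. The function name sample_size_for_margin is illustrative rather than from the original answer; it uses the standard library's statistics.NormalDist to obtain the critical value instead of the rounded 1.96.

import math
from statistics import NormalDist

def sample_size_for_margin(sigma, margin, confidence=0.95):
    # Minimum n such that the CI half-width z* * sigma / sqrt(n) is at most `margin`:
    #   n = (z* * sigma / margin)^2, rounded UP to the next integer.
    z_star = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # ~1.96 for 95%
    n_exact = (z_star * sigma / margin) ** 2                 # ~96.04 for sigma=0.05, E=0.01
    return math.ceil(n_exact)

print(sample_size_for_margin(sigma=0.05, margin=0.01))  # -> 97

math.ceil is used rather than rounding to the nearest integer because 96.04 observations are not quite enough: a sample of 96 would leave the margin of error slightly above 0.01 seconds.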

answered by Inputforcolor (8.7k points)

