The length of needles produced by a machine has a standard deviation of 0.04 inches. Assuming that the distribution is normal, how large a sample is needed to determine the mean length of the produced needles to within ±0.005 inches with 98% confidence?

1 Answer


Answer:

The sample size is
n = 347

Explanation:

From the question we are told that:

The standard deviation is \sigma = 0.04 \ inches

The precision (margin of error) is d = \pm 0.005 \ inches

The confidence level is C = 98%

Generally, the sample size is mathematically represented as

n = \frac{ Z_{\alpha/2}^2 \cdot \sigma^2 }{d^2}

Where \alpha is the level of significance, which is evaluated as

\alpha = 100\% - 98\% = 2\%

\alpha = 0.02

and Z_{\alpha/2} is the critical value corresponding to \frac{\alpha}{2} = 0.01, which is obtained from the standard normal distribution table as Z_{0.01} = 2.326
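If a normal table is not at hand, the same critical value can be obtained programmatically. A minimal sketch, assuming SciPy is available:

from scipy.stats import norm

# For a 98% confidence level, alpha = 0.02, so we need the z-value that
# leaves an upper-tail area of alpha/2 = 0.01 under the standard normal curve.
alpha = 0.02
z_crit = norm.ppf(1 - alpha / 2)  # inverse CDF (percent-point function)
print(round(z_crit, 3))           # 2.326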

Substituting values (note that it is the standard deviation \sigma = 0.04, not \alpha, that enters the formula),

n = \frac{2.326^2 \times 0.04^2}{0.005^2}

n \approx 346.3

Rounding up so that the required precision is met, the sample size is

n = 347
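As a quick check, the whole computation can be reproduced in a few lines of Python (a minimal sketch using the table value 2.326 from above):

import math

sigma = 0.04   # standard deviation of needle length (inches)
d = 0.005      # required precision / margin of error (inches)
z = 2.326      # critical value Z_{alpha/2} for 98% confidence

n_exact = (z ** 2 * sigma ** 2) / d ** 2   # n = Z_{alpha/2}^2 * sigma^2 / d^2
n = math.ceil(n_exact)                     # round up so the precision is met
print(n_exact, n)                          # about 346.3 -> 347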

answered by Jonathan Chiu (5.0k points)