The actual time it takes to cook a 10-pound turkey is normally distributed. Suppose that a random sample of 10-pound turkeys is taken.

Given that an average of 2.9 hours and a standard deviation of 0.24 hours were found for a sample of 19 turkeys, calculate a 95% confidence interval for the average cooking time of a 10-pound turkey.

1 Answer


Answer:

The interval is (2.7921, 3.0079)

Explanation:

Given that:

The sample mean (μ) = 2.9 hours

The sample standard deviation (σ) = 0.24 hours

The sample size (n) = 19 turkeys

The confidence interval (c) = 95% = 0.95

α = 1 - 0.95 = 0.05


\frac{\alpha}{2} = \frac{0.05}{2} = 0.025

The critical value is the z-score that leaves an upper-tail area of \alpha/2 = 0.025, i.e. the z-score corresponding to an area of 0.475 (0.5 − 0.025) between the mean and the critical value:


Z_{\alpha/2} = 1.96


\mu_x = \mu = 2.9


\sigma_x = \frac{\sigma}{\sqrt{n}} = \frac{0.24}{\sqrt{19}} \approx 0.0551

Therefore, the margin of error E is given by

E = Z_{\alpha/2} \cdot \sigma_x = 1.96 × 0.0551 ≈ 0.1079

The interval is
(\mu_x - E,\ \mu_x + E) = (2.9 − 0.1079,\ 2.9 + 0.1079) = (2.7921, 3.0079)

The interval is (2.7921, 3.0079)
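The calculation above can be checked with a short Python sketch using only the standard library (`statistics.NormalDist` gives the z critical value instead of a hard-coded 1.96):

```python
from statistics import NormalDist

mean = 2.9   # sample mean cooking time (hours)
s = 0.24     # sample standard deviation (hours)
n = 19       # sample size
conf = 0.95  # confidence level

z = NormalDist().inv_cdf(1 - (1 - conf) / 2)  # critical value, ~1.96
se = s / n ** 0.5                             # standard error, ~0.0551
e = z * se                                    # margin of error, ~0.1079

print(f"({mean - e:.4f}, {mean + e:.4f})")    # → (2.7921, 3.0079)
```

Note that since σ here is a sample standard deviation with n = 19, a t-interval (df = 18) would be slightly wider; the z-interval shown follows the approach used in the answer above.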

Answered by Vladimir Nani