For a particular consumer product, the mean dollar sales per retail outlet last year in a sample of n = 10 stores was x̄ = $3,425 with s = $200. The sales amounts per outlet are assumed to be normally distributed. Estimate the standard deviation of dollar sales of this product in all stores last year, using a 90 percent confidence interval.


1 Answer

Final answer:

To estimate a population standard deviation, construct a chi-square confidence interval on the variance and take the square roots of its limits. The 90% confidence interval for the standard deviation is approximately $145.87 to $329.05.

Step-by-step explanation:

The question asks for an interval estimate of the population standard deviation σ, not the mean, so the z (or t) formula for a mean does not apply here. For a normally distributed population, the statistic (n − 1)s²/σ² follows a chi-square distribution with n − 1 degrees of freedom, which gives the confidence interval for the variance:

(n − 1)s²/χ²(upper) ≤ σ² ≤ (n − 1)s²/χ²(lower)

Here n = 10, so df = 9, and s = $200, so (n − 1)s² = 9 × 40,000 = 360,000. For a 90 percent interval, the chi-square critical values with 9 degrees of freedom are χ²(0.05) = 16.919 (upper) and χ²(0.95) = 3.325 (lower).

Lower limit for σ²: 360,000 / 16.919 ≈ 21,277.9
Upper limit for σ²: 360,000 / 3.325 ≈ 108,270.7

Taking square roots, the 90 percent confidence interval for the standard deviation of dollar sales is approximately $145.87 ≤ σ ≤ $329.05.
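You can check the arithmetic with a short script. This is a minimal sketch, assuming SciPy is available; the variable names are illustrative, and the inputs are just the figures from the problem statement.

```python
from math import sqrt

from scipy.stats import chi2

n = 10          # sample size (number of stores)
s = 200.0       # sample standard deviation in dollars
conf = 0.90     # confidence level
df = n - 1      # degrees of freedom for the chi-square statistic

# chi2.ppf is the inverse CDF: the 0.95 quantile is the "upper"
# critical value and the 0.05 quantile is the "lower" one.
chi2_upper = chi2.ppf(1 - (1 - conf) / 2, df)   # ≈ 16.919
chi2_lower = chi2.ppf((1 - conf) / 2, df)       # ≈ 3.325

# Confidence interval for the variance, then square roots for sigma.
var_low = df * s**2 / chi2_upper
var_high = df * s**2 / chi2_lower
print(f"90% CI for sigma: ${sqrt(var_low):.2f} to ${sqrt(var_high):.2f}")
# Prints roughly: 90% CI for sigma: $145.87 to $329.05
```

The same quantiles can be read off a printed chi-square table for df = 9 if SciPy is not at hand.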

