Suppose it is desired to estimate the average time a customer spends in Dollar Tree to within 5 minutes at 99% reliability. It is estimated that the standard deviation of the times is 15 minutes. How large a sample should be taken to get the desired interval?

asked by User Mockee (4.3k points)

1 Answer


Answer:

n = 60 customers.

Explanation:

To estimate a mean to within a margin of error E at a given confidence level, the required sample size is n = (zσ/E)². Here the margin of error is E = 5 minutes, the estimated standard deviation is σ = 15 minutes, and the two-sided critical value for 99% confidence is z ≈ 2.576. Then n = (2.576 × 15 / 5)² = 7.728² ≈ 59.7. Because a sample size must be a whole number, round up: n = 60.
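If you want to check the arithmetic, here is a minimal Python sketch (assuming SciPy is available); the variable names are just for illustration:

```python
import math
from scipy.stats import norm

confidence = 0.99   # desired reliability
sigma = 15          # estimated standard deviation of times (minutes)
E = 5               # desired margin of error (minutes)

# Two-sided critical value: z such that the middle `confidence` mass lies within +/- z
z = norm.ppf(1 - (1 - confidence) / 2)   # about 2.576

# Required sample size, rounded up to the next whole customer
n = math.ceil((z * sigma / E) ** 2)

print(round(z, 3), n)   # 2.576 60
```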

answered by User Andrei Karcheuski (4.8k points)