In a study of the accuracy of fast food drive-through orders, one restaurant had 36 orders that were not accurate among 322 orders observed. Use a 0.01 significance level to test the claim that the rate of inaccurate orders is equal to 10%. Does the accuracy rate appear to be acceptable?

asked by User Rontron

1 Answer


Answer:

Explanation:

We are given that, in a study of the accuracy of fast food drive-through orders, one restaurant had 36 orders that were not accurate among 322 orders observed.

Sample proportion = 36/322 ≈ 0.112

Hypotheses would be:


H0: p = 0.10
Ha: p ≠ 0.10

(two tailed test at 1% significance level)

Difference in proportions = 0.112 − 0.10 = 0.012

Std error = √(0.10 × 0.90 / 322) ≈ 0.017

Test statistic z = 0.012 / 0.017 ≈ 0.706

p-value = 2 · P(Z > 0.706) ≈ 0.48 (two-tailed)

Since the p-value (0.48) is greater than the significance level (0.01), we fail to reject the null hypothesis.

The accuracy rate appears to be acceptable: there is not sufficient evidence that the rate of inaccurate orders differs from 10%.
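The one-proportion z-test worked out above can be sketched in Python using only the standard library (the variable names are illustrative, not from the original):

```python
from statistics import NormalDist

# One-proportion z-test for H0: p = 0.10 vs Ha: p != 0.10
x, n = 36, 322          # inaccurate orders, total orders observed
p0 = 0.10               # hypothesized proportion
alpha = 0.01            # significance level

p_hat = x / n                                 # sample proportion, ~0.112
se = (p0 * (1 - p0) / n) ** 0.5               # standard error under H0, ~0.017
z = (p_hat - p0) / se                         # test statistic, ~0.71
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed p-value, ~0.48

print(f"z = {z:.3f}, p-value = {p_value:.2f}")
if p_value > alpha:
    print("Fail to reject H0: no evidence the inaccuracy rate differs from 10%")
```

Note that the standard error uses the hypothesized proportion p0 (not the sample proportion), which is the convention for a one-proportion z-test.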

answered by User Sjors Ottjes
