Judy thinks there will be 325 people at the county fair on Friday, while Atticus thinks there will be 600 people. On Friday, 452 people attended the fair. What is the difference between the percent errors? Please help.

by User Nyna

1 Answer


Judy's estimate is closer to the actual attendance, and the difference between the two percent errors is about 4.65%.

Explanation:

First, take the mean of Judy's and Atticus's estimates:

(325 + 600)/2 = 925/2 = 462.5

The actual attendance, 452, is less than the mean of 462.5. When the actual value falls below the midpoint of two estimates, the lower estimate is the closer one.

Therefore, Judy's estimate is closer to the actual number.

Difference in percent error = % error of Atticus − % error of Judy

% error = |estimate − actual| / actual × 100

% error for Judy = |325 − 452| / 452 × 100 = 127/452 × 100 ≈ 28.10%

% error for Atticus = |600 − 452| / 452 × 100 = 148/452 × 100 ≈ 32.74%

Difference = 32.74% − 28.10% ≈ 4.65%
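As a quick check, the same calculation can be sketched in a few lines of Python (variable names here are just for illustration):

```python
# Percent error: |estimate - actual| / actual * 100
actual = 452
judy, atticus = 325, 600

judy_error = abs(judy - actual) / actual * 100       # ≈ 28.10
atticus_error = abs(atticus - actual) / actual * 100  # ≈ 32.74

print(round(judy_error, 2))                  # 28.1
print(round(atticus_error, 2))               # 32.74
print(round(atticus_error - judy_error, 2))  # 4.65
```

This confirms Judy's smaller percent error and a difference of roughly 4.65 percentage points.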

by User Harsh Kumar Narula