Bev weighs a bag of apples labeled 5 pounds and finds that the weight is actually 72 ounces. To the nearest percent, what is the percent error in the weight? (1 pound = 16 ounces)

asked by User Wololo (6.4k points)

1 Answer


Answer: The percent error is 10%.

Explanation:

According to the question, the labeled weight of the bag of apples = 5 pounds.

Since 1 pound = 16 ounces,

the labeled weight of the bag of apples = 5 × 16 = 80 ounces.

But the actual weight of the bag of apples = 72 ounces.

Percent error = (labeled value − actual value) × 100 / labeled value

Thus, the percent error in the weight = (80 − 72) × 100 / 80 = 800/80 = 10%.
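The arithmetic above can be checked with a short Python sketch (the variable names are illustrative, not from the problem):

```python
OUNCES_PER_POUND = 16

labeled_oz = 5 * OUNCES_PER_POUND   # labeled weight: 5 lb = 80 oz
actual_oz = 72                      # measured weight in ounces

# Percent error relative to the labeled weight
percent_error = (labeled_oz - actual_oz) * 100 / labeled_oz
print(round(percent_error))  # 10
```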


answered by User Apostlion (7.2k points)