You are testing two scales for accuracy by weighing two different objects. Use the drop-down menus to complete the statements about the scales' accuracy.

Because the object on scale 1 weighs (less/more) than the object on scale 2, each pound of error results in a (lesser/greater) percent error. Scale 1 was (0/1/10/90/100) pounds off and scale 2 was (0/3/22/25/47) pounds off, so scale 1 was (3/10/20/90/100)% off and scale 2 was (3/10/12/50/88)% off.

1 Answer


Final answer:

Because the object on scale 1 weighs less than the object on scale 2, each pound of error results in a greater percent error: scale 1 was 1 pound off (10% error) while scale 2 was 3 pounds off (3% error).

Step-by-step explanation:

Percent error compares the size of the measurement error to the true weight of the object: percent error = (pounds off ÷ actual weight) × 100.

If the object on scale 1 weighs less than the object on scale 2, each pound of error will result in a greater percent error.

Scale 1 was 1 pound off and scale 2 was 3 pounds off. Because the object on scale 1 weighs 10 pounds and the object on scale 2 weighs 100 pounds, scale 1 was 1 ÷ 10 = 10% off and scale 2 was 3 ÷ 100 = 3% off.
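
As a check, here is a minimal Python sketch of the percent-error calculation. The specific readings (11 lb shown for a 10 lb object, 103 lb shown for a 100 lb object) are assumptions inferred from the answer choices, since the original scale image is not reproduced here.

def percent_error(measured: float, actual: float) -> float:
    # Percent error = |measured - actual| / actual * 100
    return abs(measured - actual) / actual * 100

# Assumed readings (not given in the text, inferred from the drop-down options):
# scale 1 shows 11 lb for a 10 lb object, scale 2 shows 103 lb for a 100 lb object.
scale1 = percent_error(measured=11, actual=10)    # 1 lb off on 10 lb
scale2 = percent_error(measured=103, actual=100)  # 3 lb off on 100 lb

print(f"Scale 1: {scale1:.0f}% off")  # Scale 1: 10% off
print(f"Scale 2: {scale2:.0f}% off")  # Scale 2: 3% off

The smaller object makes each pound of error a larger fraction of the true weight, which is why scale 1's smaller error in pounds still produces the larger percent error.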

answered by Adam Maass (7.2k points)