5 votes
One argument against the American standard system of measurement is that it offers a lesser degree of accuracy than the metric system. Explain how the standard side of a ruler, with inches broken down to sixteenths of an inch, would have less accuracy than the metric side of the same ruler, with each centimeter broken down to millimeters.

asked by Lovro (6.7k points)

2 Answers

2 votes
1/16 inch is 0.0625 inch.
1 cm = 10 mm, so 1 mm = 0.1 cm.

The difference between 0.0625 and 0.1 is that 0.0625 is harder to subdivide and read off a scale than 0.1. More to the point, 1/16 inch is about 1.5875 mm, so the smallest division on the inch side is larger than the 1 mm division on the metric side, and the metric side can therefore resolve smaller lengths.
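
Below is a small Python sketch (purely illustrative, assuming the standard definition 1 inch = 25.4 mm exactly) that compares the smallest division on each side of the ruler in millimeters:

```python
from fractions import Fraction

# Standard definition: 1 inch = 25.4 mm exactly
MM_PER_INCH = Fraction(254, 10)

inch_division_mm = Fraction(1, 16) * MM_PER_INCH  # smallest tick on the inch side
metric_division_mm = Fraction(1)                  # smallest tick on the metric side: 1 mm

print(float(inch_division_mm))    # 1.5875 -> each 1/16-inch tick spans 1.5875 mm
print(float(metric_division_mm))  # 1.0    -> each metric tick spans only 1 mm
print(inch_division_mm > metric_division_mm)  # True: the inch divisions are coarser
```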
answered by Zazaeil (6.1k points)
7 votes
In this item, we divide one (1) inch into 16 equal parts to show why the American standard system is considered less accurate than the metric system.
1 inch / 16 = 0.0625 inch per part

This result has four digits after the decimal point, which makes the subdivisions awkward to express and to read off a scale.

Whereas, dividing a centimeter into millimeters gives 10 equal parts, so 1 mm = 0.1 cm.

This value has only one decimal place, and since 1 mm is also a smaller division than 1/16 inch, the level of accuracy of the metric side is higher.
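
A short Python sketch (purely illustrative, not part of the original answer) that prints the two decimal forms and counts their decimal places:

```python
# Decimal form of the smallest ruler division on each side
inch_part = 1 / 16   # 0.0625 inch -> four decimal places
cm_part = 1 / 10     # 0.1 cm      -> one decimal place

for value, unit in [(inch_part, "inch"), (cm_part, "cm")]:
    text = f"{value:g}"
    digits = len(text.split(".")[1])
    print(f"{text} {unit}: {digits} decimal place(s)")
```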
answered by Peter Badida (6.3k points)