Answer:
Accuracy and precision tell us how much we can rely on a measuring device's readings. ±.001 as an "accuracy" claim is vague because no unit accompanies the figure, and the claim actually fits the definition of precision better than that of accuracy.
Step-by-step explanation:
Accuracy and Precision: the golden couple.
Accuracy and precision are the key elements in deciding whether a measuring device is reliable for a specific task. Accuracy describes how close the readings are to the true or reference values. Precision, on the other hand, refers to repeatability, that is, how consistent the readings of a device are when measuring the same element at different times. A tool that falls short on either one may be unsuitable for certain engineering projects, where lack of accuracy (readings far from the real values) or lack of precision (inconsistent readings) may lead to malfunctions and severe delays in the project's development. The short sketch below illustrates how both can be estimated from repeated readings.
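As a rough illustration (not from the manufacturer's data), here is a minimal Python sketch using hypothetical repeated readings of an assumed 25.400 mm reference gauge block: accuracy is taken as the mean error against the reference value, and precision as the standard deviation of the repeated readings.

```python
# Minimal sketch (hypothetical numbers): estimating accuracy and precision
# from repeated caliper readings of an assumed 25.400 mm reference gauge block.
from statistics import mean, stdev

reference_mm = 25.400                                    # assumed true value of the gauge block
readings_mm = [25.398, 25.401, 25.399, 25.402, 25.400]   # hypothetical repeated readings

accuracy_error_mm = mean(readings_mm) - reference_mm     # closeness to the true value
precision_mm = stdev(readings_mm)                        # repeatability (spread of readings)

print(f"Mean error (accuracy): {accuracy_error_mm:+.4f} mm")
print(f"Standard deviation (precision): {precision_mm:.4f} mm")
```

A device could show a very small standard deviation (high precision) while its mean error stays large (poor accuracy), which is exactly why the two figures must be reported separately and with units.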
±.001 what unit?
The manufacturer says this is an accuracy indicator; however, no unit is stated, so the figure does not actually tell us how accurate the device is. Additionally, that notation is more commonly used to state a device's tolerance, that is, the range of values the instrument may show when reading the same element, so it tells us more about the device's precision during measurements than about its actual accuracy. I would recommend the following to the dial caliper's manufacturer to better state its measurement specifications:
- Use ±.001 as a precision (repeatability) figure, and always state the unit for it (for example, ±0.001 in or ±0.001 mm).
- Conduct calibration tests against a traceable reference to determine the actual accuracy, and present it in one of the commonly used forms: error percentage or parts per million (ppm). The sketch after this list shows how those figures can be computed.
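As a simple example (again with hypothetical numbers, not the manufacturer's data), a single measurement error can be converted to an error percentage and to ppm relative to the reference value:

```python
# Minimal sketch (hypothetical figures): expressing an accuracy error as a
# percentage of the reference value and as parts per million (ppm).
measured_mm = 25.398       # hypothetical caliper reading
reference_mm = 25.400      # assumed true value

error_mm = measured_mm - reference_mm
error_percent = 100 * error_mm / reference_mm
error_ppm = 1_000_000 * error_mm / reference_mm

print(f"Error: {error_mm:+.3f} mm ({error_percent:+.4f} %, {error_ppm:+.0f} ppm)")
```

Stated this way, a buyer can compare the caliper's accuracy directly against other instruments regardless of the unit system used on the dial.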