Final answer:
An uncertainty of ±0.2°C in a thermometer tells us that the actual temperature could be up to 0.2°C above or below the displayed reading. This small uncertainty is important for reliable fever detection. Percent uncertainty is calculated by dividing the absolute uncertainty by the measured value and multiplying by 100.
Step-by-step explanation:
For a thermometer with an uncertainty of ±0.2°C, any temperature measurement could be up to 0.2°C higher or lower than the displayed reading.
To understand the real-world implications of this, consider taking the temperature of a sick child. If the thermometer's uncertainty were as high as ±3.0°C instead of ±0.2°C, it would be hard to tell whether a 37.0°C reading indicates normal body temperature or something more serious, since the true temperature could range anywhere from 34.0°C to 40.0°C.
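As a quick illustration of how a reading and its uncertainty translate into a possible temperature range, here is a minimal Python sketch; the variable names (reading, uncertainty) are illustrative choices, not part of the original problem.

```python
# Possible true-temperature range for a reading with a given uncertainty.
reading = 37.0       # displayed temperature in °C
uncertainty = 3.0    # hypothetical large uncertainty in °C

low = reading - uncertainty
high = reading + uncertainty

print(f"True temperature lies between {low:.1f} °C and {high:.1f} °C")
# Output: True temperature lies between 34.0 °C and 40.0 °C
```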
This variability could mean the difference between a healthy child and one experiencing a dangerous fever or chills.

Uncertainty can also be expressed as a percentage. For example, a measuring tape with an uncertainty of 0.50 cm over a distance of 20 m (2000 cm) has a percent uncertainty of (0.50 cm / 2000 cm) × 100 = 0.025%, found by dividing the absolute uncertainty by the total measured value and then multiplying by 100.
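To make the tape-measure calculation concrete, here is a short Python sketch of that arithmetic; the helper name percent_uncertainty is an illustrative choice, not a standard function.

```python
def percent_uncertainty(uncertainty, measured_value):
    """Percent uncertainty = (absolute uncertainty / measured value) * 100."""
    return uncertainty / measured_value * 100

# Tape measure: ±0.50 cm uncertainty over a 20 m distance.
tape_error_cm = 0.50
distance_cm = 20 * 100  # convert 20 m to cm so both values share the same unit

print(f"{percent_uncertainty(tape_error_cm, distance_cm):.3f}%")  # prints 0.025%
```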