Final answer:
A thermometer that shows -5°C at the melting point of ice and 70°C when the true temperature is 60°C has a linear calibration error. Extrapolating that straight-line relationship to the true boiling point of water (100°C) shows the faulty thermometer will read 120°C.
Step-by-step explanation:
The question concerns a thermometer with incorrect calibration: it shows the melting point of ice as -5°C and reads 70°C when the actual temperature is 60°C. The task is to determine what this thermometer reads at the boiling point of water. On the true Celsius scale, ice melts at 0°C and water boils at 100°C. Since the thermometer underreads by 5°C at 0°C but overreads by 10°C at 60°C, the error is not a constant offset; it grows linearly with temperature, so the faulty reading is a linear function of the true temperature.
By setting up a proportion between the two scales, the two calibration points (T_real, T_obs) = (0°C, -5°C) and (60°C, 70°C) fix the linear relationship, which can then be extrapolated to the boiling point. The observed boiling point B_obs is given by:

B_obs = F_obs + (B_real − F_real) × (C_obs − F_obs) / (C_real − F_real)

where F_obs = -5°C is the observed freezing point, F_real = 0°C and B_real = 100°C are the real freezing and boiling points, and (C_real, C_obs) = (60°C, 70°C) is the second calibration point. The slope (C_obs − F_obs) / (C_real − F_real) = (70 − (−5)) / (60 − 0) = 75/60 = 1.25 means the faulty thermometer advances 1.25 of its degrees for every real degree. Substituting the given values:

B_obs = −5°C + 100°C × 1.25 = 120°C

So the miscalibrated thermometer reads 120°C at the boiling point of water.
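As a quick check, here is a minimal Python sketch of the same extrapolation (the function name faulty_reading and the way the calibration points are passed in are illustrative choices, not part of the original problem):

def faulty_reading(t_real, p1=(0.0, -5.0), p2=(60.0, 70.0)):
    # Map a true temperature to the faulty thermometer's reading,
    # given two (t_real, t_obs) calibration points.
    (r1, o1), (r2, o2) = p1, p2
    slope = (o2 - o1) / (r2 - r1)  # 75 / 60 = 1.25 faulty degrees per real degree
    return o1 + slope * (t_real - r1)

print(faulty_reading(0.0))    # -5.0  (melting point of ice)
print(faulty_reading(60.0))   # 70.0  (second calibration point)
print(faulty_reading(100.0))  # 120.0 (boiling point of water)

Running it reproduces both calibration points and returns 120.0 for the true boiling point, matching the hand calculation.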