Final answer:
To determine the error in measuring a distance at a higher temperature with an iron rod, we calculate the expansion of a single rod and then multiply by the number of rods needed to cover the full distance. The total error in measuring 2 km with the iron rods at 40°C is 0.72 meters.
Step-by-step explanation:
To calculate the error in measuring the distance due to the thermal expansion of an iron rod, we can use the formula for linear expansion, ΔL = αLΔT, where ΔL is the change in length, α is the linear expansivity of the material, L is the original length, and ΔT is the change in temperature.
The original length of the iron rod (L) is 100 meters, and the change in temperature (ΔT) is 40°C - 10°C = 30°C. The linear expansivity of iron (α) is given as 1.2×10⁻⁵/°C. Plugging these values into the formula, we find the change in length (ΔL) of one rod and then multiply it by the number of rods used to measure the 2 km distance.
ΔL = (1.2×10⁻⁵/°C) × (100 m) × (30°C) = 0.036 m per rod.
Since 2 km is 2000 meters and each rod is 100 meters, we would use 20 rods to measure 2 km. Therefore, the total error in measuring the 2 km distance would be 20 rods × 0.036 m/rod = 0.72 m.
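For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation (variable names are illustrative, not part of the problem statement):

```python
# Minimal check of the thermal-expansion arithmetic (illustrative names).
alpha = 1.2e-5         # linear expansivity of iron, per °C
rod_length = 100.0     # calibrated rod length at 10°C, in m
delta_T = 40.0 - 10.0  # temperature rise, in °C

delta_L = alpha * rod_length * delta_T  # ΔL = αLΔT, per rod
num_rods = 2000.0 / rod_length          # rods laid end to end over 2 km
total_error = num_rods * delta_L

print(f"Expansion per rod:  {delta_L:.3f} m")      # 0.036 m
print(f"Total error (2 km): {total_error:.2f} m")  # 0.72 m
```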
The error arises because each rod, having expanded to 100.036 m, is still counted as its calibrated length of 100 m. Each rod laid down therefore covers slightly more ground than is recorded, so the reading understates the true distance: the measured distance would be less than the actual distance by 0.72 meters.
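A short numerical illustration of this direction of error, under the same assumptions (names again illustrative): twenty expanded rods physically span more ground than the 2000 m written down.

```python
# Each rod has expanded but is still counted as 100 m, so 20 rods
# span more ground than the recorded 2000 m.
expanded_rod = 100.0 * (1 + 1.2e-5 * 30)  # 100.036 m at 40°C
actual_span = 20 * expanded_rod           # ground actually covered, m
recorded_span = 20 * 100.0                # distance written down, m

print(f"Actual span:    {actual_span:.2f} m")                  # 2000.72 m
print(f"Recorded span:  {recorded_span:.2f} m")                # 2000.00 m
print(f"Understatement: {actual_span - recorded_span:.2f} m")  # 0.72 m
```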