Final answer:
Stepping the voltage up from 180 V to 1800 V, rather than transmitting at 180 V, decreases the line current by a factor of 10. Because power loss is proportional to the square of the current, the line loss falls by a factor of 100, ideally a 99% reduction; the closest option offered is (D), a 98% reduction in power loss.
Step-by-step explanation:
This question concerns power transmission and the use of transformers in electricity distribution. Electricity is transmitted over long distances at high voltage to reduce power loss: a step-up transformer raises the voltage, which lowers the current in the transmission lines, and a lower current means less power dissipated in the resistance of the wires.
Power loss in the transmission lines is proportional to the square of the current (P_loss = I^2 R). Because the transmitted power is fixed (P = VI), stepping the voltage up from 180 V to 1800 V reduces the current by a factor of 10. A factor-of-10 drop in current therefore cuts the line loss by a factor of 10^2 = 100: whatever power was lost at 180 V, only 1% of that loss remains at 1800 V, which is a 99% reduction. Since 99% does not appear among the provided options, the closest available choice is (D) 98%.
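For anyone who wants to check the arithmetic, here is a minimal Python sketch. The transmitted power and line resistance used below are assumed placeholder values, not numbers from the problem; they cancel out of the ratio, so only the 180 V versus 1800 V comparison affects the result.

```python
# A minimal numeric check of the argument above (not part of the original
# problem): the transmitted power P and line resistance R are assumed,
# arbitrary values -- they cancel out of the loss ratio.

def line_loss(power_w, voltage_v, resistance_ohm):
    """Power dissipated in the line: I^2 * R, with I = P / V."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

P = 10_000.0   # assumed transmitted power in watts
R = 0.5        # assumed line resistance in ohms

loss_180 = line_loss(P, 180.0, R)
loss_1800 = line_loss(P, 1800.0, R)

print(round(loss_1800 / loss_180, 6))           # 0.01 -> new loss is 1% of the old loss
print(round((1 - loss_1800 / loss_180) * 100))  # 99   -> 99% reduction in line loss
```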