Final answer:
To read a multimeter set to megohms, interpret the displayed value as millions of ohms (1 megohm = 1,000,000 ohms). For an accurate resistance reading, isolate the component from the circuit so the meter can apply Ohm's law to that component alone. Using the wrong mode setting can disturb the circuit or damage the meter.
Step-by-step explanation:
If an older multimeter is set to display its reading in megohms, make sure the reading is scaled correctly. A measurement in megohms means the resistance being measured is in the range of millions of ohms, so you read off the value and interpret it as megohms (1 megohm = 1,000,000 ohms). For example, if the multimeter displays 2 megohms, that is equivalent to 2,000,000 ohms.
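As a quick illustration of the scaling, here is a minimal Python sketch; the helper name megohms_to_ohms is just for illustration:

    def megohms_to_ohms(megohms: float) -> float:
        """Convert a reading in megohms to ohms (1 megohm = 1,000,000 ohms)."""
        return megohms * 1_000_000

    print(megohms_to_ohms(2.0))  # 2000000.0, matching the example above
    print(megohms_to_ohms(0.5))  # 500000.0, i.e. half a megohm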
To measure resistance accurately with a multimeter's ohmmeter mode, the component being tested must be isolated from the rest of the circuit. Otherwise, parallel paths through other components carry part of the test current and skew the result. Isolation ensures that the current from the multimeter's internal voltage source flows solely through the component of interest, so the meter can compute its resistance from Ohm's law (R = V/I).
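To make the Ohm's-law step concrete, here is a minimal sketch of the calculation the meter performs internally; the test values (an assumed 0.5 V internal source driving 0.25 microamps through the isolated component) are round numbers chosen for illustration:

    def resistance_ohms(volts: float, amps: float) -> float:
        """Ohm's law: R = V / I."""
        return volts / amps

    # Assumed example: the meter applies 0.5 V and measures 0.25 microamps.
    r = resistance_ohms(0.5, 0.25e-6)
    print(f"{r:.0f} ohms = {r / 1e6:.1f} megohms")  # 2000000 ohms = 2.0 megohms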
It is also crucial to check the mode before connecting the meter. If you are measuring current but accidentally leave the multimeter in voltmeter mode, the voltmeter's very high internal resistance sits in series with the circuit and throttles the current, so the circuit's performance is affected even though the meter itself is safe. Conversely, if you measure voltage with the meter set to ammeter mode, the ammeter's very low resistance effectively short-circuits the points being probed, which can damage the meter; ammeters are designed to have very low resistance and are normally protected by a fuse for exactly this reason.
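A back-of-the-envelope sketch of the series-insertion case makes the first problem visible; the 9 V source, 1 kilohm load, ~1 ohm ammeter resistance, and 10 megohm voltmeter resistance are all assumed round numbers for illustration:

    V = 9.0            # assumed source voltage
    R_LOAD = 1_000.0   # assumed circuit resistance (1 kilohm)

    def series_current(meter_resistance: float) -> float:
        """Current with the meter in series: I = V / (R_load + R_meter)."""
        return V / (R_LOAD + meter_resistance)

    print(series_current(1.0))    # ~0.00899 A: a ~1-ohm ammeter barely disturbs the circuit
    print(series_current(10e6))   # ~9e-07 A: a 10-megohm voltmeter all but stops the current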