Light is a form of electromagnetic wave, containing oscillating electric and magnetic fields. The wave amplitude detector mentioned above shows how the electric field oscillates in time at the location of the probe. The amplitude of the wave at the location of the probe is equal to the maximum electric field measured. How does the amplitude of the wave depend on the distance from the source?

1 Answer


Answer:

The amplitude decreases as 1/r with distance from the source; this follows from the conservation of energy in the expanding wavefront.

Step-by-step explanation:

Electromagnetic waves are transverse waves formed by oscillating electric and magnetic fields. They obey the principle of conservation of energy and, from a point source, travel outward spherically from their point of origin.

By the principle of conservation of energy, once the wave is emitted its energy is distributed over an expanding spherical wavefront of area 4πr². For the total energy to be conserved, the energy per unit area (the intensity) must decrease as the surface area grows, that is, as the inverse of the radius squared (1/r²).

This 1/r² decrease applies to the intensity. Since the intensity is proportional to the square of the electric-field amplitude, the amplitude itself falls off as 1/r.

In summary, the intensity falls off as 1/r² and the amplitude as 1/r, both consequences of the conservation of energy in the wavefront.
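The reasoning above can be checked numerically. The sketch below (a minimal illustration, not from the original answer; the function names and the 100 W source power are made up for the example) spreads a source's power over a sphere of area 4πr² and takes the amplitude as the square root of the intensity:

```python
import math

def intensity(power_watts, r_m):
    # Energy conservation: the emitted power spreads over a sphere
    # of area 4*pi*r^2, so intensity falls off as 1/r^2.
    return power_watts / (4 * math.pi * r_m**2)

def amplitude_ratio(r1, r2):
    # Intensity is proportional to amplitude squared, so the
    # electric-field amplitude falls off as 1/r.
    return math.sqrt(intensity(1.0, r2) / intensity(1.0, r1))

# Doubling the distance quarters the intensity and halves the amplitude.
print(intensity(100.0, 2.0) / intensity(100.0, 1.0))  # 0.25
print(amplitude_ratio(1.0, 2.0))                      # 0.5
```

Doubling the distance divides the intensity by four but the amplitude only by two, which is exactly the 1/r² versus 1/r distinction in the answer.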
