A sinusoidal electromagnetic wave has an average intensity of I₁ = 100 W/m². By what factor would the electric-field amplitude of the wave have to be increased for the wave to have an average intensity of I₂ = 8100 W/m²?

asked by TanThien (3.1k points)

1 Answer


Answer:

9

Step-by-step explanation:

Given:

Intensity I₁ = 100 W/m²

Intensity I₂ = 8100 W/m²

The average intensity of an electromagnetic wave is directly proportional to the square of its electric-field amplitude (I ∝ A²), so the amplitude ratio is the square root of the intensity ratio.

A₂/A₁ = √(I₂/I₁)

A₂/A₁ = √(8100/100)

A₂/A₁ = √81

A₂/A₁ = 9

The electric-field amplitude must therefore be increased by a factor of 9.
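As a quick numerical check, the same ratio can be computed directly; this is a minimal sketch with illustrative variable names, not part of the original answer:

```python
import math

# Given average intensities (W/m^2)
I1 = 100.0
I2 = 8100.0

# Intensity scales with the square of the field amplitude, I ∝ A²,
# so the amplitude ratio is the square root of the intensity ratio.
factor = math.sqrt(I2 / I1)
print(factor)  # 9.0
```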

answered by Snehal Parmar (2.7k points)