16.7k views
4 votes
An 800-kHz radio signal is detected at a point 8.5 km distant from a transmitter tower. The electric field amplitude of the signal at that point is 0.90 V/m. Assume that the signal power is radiated uniformly in all directions and that radio waves incident upon the ground are completely absorbed. What is the average electromagnetic energy density at that point?

2 Answers

3 votes

Answer:

3.58 × 10^-12 J/m³

Step-by-step explanation:

The electric field amplitude given, E₀ = 0.90 V/m, is already the value at the detection point 8.5 km from the tower, so it does not need to be scaled with distance again. For a sinusoidal electromagnetic wave, the average energy density (the electric and magnetic contributions are equal on average) is:

u = (1/2)ε₀E₀²

ε₀: permittivity of free space = 8.85 × 10^-12 C²/N·m²

u = (1/2)(8.85 × 10^-12)(0.90)² J/m³ = 3.58 × 10^-12 J/m³

Hence, the average electromagnetic energy density at that point is 3.58 × 10^-12 J/m³.

User Mike Chan, 4.6k points
2 votes

Answer:

Step-by-step explanation:

Given that,

Frequency of radio signal is

f = 800kHz = 800,000 Hz.

Distance from transmitter

d = 8.5km = 8500m

Electric field amplitude

E = 0.9 V/m

The total average energy density (electric plus magnetic contributions, which are equal on average) can be calculated using

U = ½•ϵo•E²

Where E = 0.9 V/m is the field amplitude and ϵo = 8.85 × 10^-12 F/m

Then,

U = ½ × 8.85 × 10^-12 × 0.9²

U = 3.58 × 10^-12 J/m³

The average electromagnetic energy density is 3.58 × 10^-12 J/m³
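As a quick numerical check of the arithmetic above, the same calculation can be sketched in a few lines of Python (constants as given in the problem):

```python
# Quick check of the average EM energy density calculation.
eps0 = 8.85e-12   # permittivity of free space, F/m (C^2 / N*m^2)
E0 = 0.90         # electric field amplitude at the detection point, V/m

# Average energy density of a sinusoidal EM wave: u = (1/2) * eps0 * E0^2
# (electric and magnetic contributions are equal on average)
u = 0.5 * eps0 * E0**2
print(f"u = {u:.2e} J/m^3")   # u = 3.58e-12 J/m^3
```

Note that the frequency (800 kHz) and the distance (8.5 km) are not needed for this part, since the field amplitude is given directly at the point of interest.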

User Ala Tarighati, 3.8k points