4 votes
Suppose the maximum safe intensity of microwaves for human exposure is taken to be 1.39 W/m². If a radar unit leaks 10.0 W of microwaves (other than those sent by its antenna) uniformly in all directions, how far away must you be to be exposed to an intensity considered to be safe? Assume that the power spreads uniformly over the area of a sphere with no complications from absorption or reflection. (Note that early radar units leaked more than modern ones do. This caused identifiable health problems, such as cataracts, for people who worked near them.)

by Headshota (4.7k points)

1 Answer

2 votes

Answer:

≈ 0.757 m

Step-by-step explanation:

Intensity I is the power P divided by the area A over which it spreads; here the leaked power spreads over the surface of a sphere centered on the radar unit.

I = P/A

A = P/I

Power = 10.0 W

Intensity = 1.39 W/m^2

A = 10.0/1.39 = 7.194 m²

Area A = 4πr²

7.194 = 4 × 3.1416 × r²

7.194 = 12.566r²

r² = 7.194/12.566 = 0.5725

r = √0.5725 ≈ 0.757 m
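
As a quick sanity check, here is a minimal Python sketch of the same calculation, solving I = P/(4πr²) directly for r (the variable names are illustrative, not from the original problem):

```python
import math

P = 10.0  # leaked power, in W
I = 1.39  # maximum safe intensity, in W/m^2

# The power spreads over a sphere of radius r: I = P / (4*pi*r^2),
# so the safe distance is r = sqrt(P / (4*pi*I)).
r = math.sqrt(P / (4 * math.pi * I))
print(f"Safe distance: {r:.3f} m")  # prints ~0.757 m
```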

by ModX (4.1k points)