Suppose the maximum safe intensity of microwaves for human exposure is taken to be 1.00 W/m². (a) If a radar unit leaks 10.0 W of microwaves (other than those sent by its antenna) uniformly in all directions, how far away must you be to be exposed to an intensity considered to be safe?


Answer:

You must be at least about 0.892 meters away to be exposed to an intensity considered to be safe.

Step-by-step explanation:

Suppose the leaked power spreads uniformly in all directions, so at a distance r from the unit it passes through the surface of a sphere of radius r. Intensity is power per unit area, so:

I = \frac{\dot W}{4\pi r^{2}} (1)

Where:

I - intensity, measured in watts per square meter.

\dot W - leaked power, measured in watts.

r - distance from the source, measured in meters.

If we know that \dot W = 10.0\,W and I = 1.00\,\frac{W}{m^{2}}, then solving (1) for the radius gives:

r^{2} = \frac{\dot W}{4\pi\cdot I}

r = \sqrt{\frac{\dot W}{4\pi\cdot I}}

r = \sqrt{\frac{10.0\,W}{4\pi\cdot\left(1.00\,\frac{W}{m^{2}}\right)}}

r \approx 0.892\,m

Hence you must be at least about 0.892 meters from the radar unit for the leaked microwave intensity to fall to the level considered safe.
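
As a quick numerical check, here is a minimal Python sketch (my addition, not part of the original answer; the names P and I_safe are illustrative) that evaluates equation (1) for r:

import math

P = 10.0       # leaked microwave power, in watts (from the problem)
I_safe = 1.00  # maximum safe intensity, in W/m^2 (from the problem)

# Inverse-square spreading: I = P / (4*pi*r**2), solved for r
r = math.sqrt(P / (4 * math.pi * I_safe))
print(f"Minimum safe distance: {r:.3f} m")  # prints 0.892 m

Any distance greater than this keeps the leaked intensity below 1.00 W/m², since intensity falls off as 1/r².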
