If a transmitter uses a signal power of 2 Watts that can be reliably received within a distance of up to 3 miles, what is the lowest signal power that would be required for a similar signal to be received within a distance of 15 miles?


1 Answer


Answer:


50 Watts

Step-by-step explanation:


$P_1$ = initial signal power = 2 Watts

$r_1$ = initial range at which the signal is reliably received = 3 miles

$P_2$ = final signal power = ?

$r_2$ = final range at which the signal is reliably received = 15 miles

Intensity is inversely proportional to the square of the distance from the source. For the signal to be just reliably received at the edge of each range, the intensity there must be the same in both cases, hence


$$\frac{P_1}{r_1^2} = \frac{P_2}{r_2^2}$$

$$\frac{2}{3^2} = \frac{P_2}{15^2}$$

$$\frac{2}{9} = \frac{P_2}{225}$$

$$P_2 = \frac{2 \times 225}{9}$$

$$P_2 = 50\ \text{Watts}$$
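
As a quick check, here is a minimal Python sketch of the same inverse-square scaling; the function name `required_power` is illustrative, not part of the original answer:

```python
def required_power(p1: float, r1: float, r2: float) -> float:
    """Power needed at range r2 so the received intensity equals
    that of power p1 at range r1, per the inverse-square law."""
    return p1 * (r2 / r1) ** 2

# 2 Watts reliable out to 3 miles -> power needed for 15 miles
print(required_power(2.0, 3.0, 15.0))  # 50.0
```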
