5 votes
Sources A and B emit long-range radio waves of wavelength 360 m, with the phase of the emission from A ahead of that from source B by 90°. The distance rA from A to a detector is greater than the corresponding distance rB from B by 140 m. What is the magnitude of the phase difference at the detector?

asked by User Stack (3.3k points)

2 Answers

1 vote

Answer:

≈ 0.87 rad

Step-by-step explanation:

Phase difference due to the 140 m path difference:

(2π/λ) × x = (2π/360) × 140 ≈ 2.44 rad

The source phase lead of A is 90° = π/2 ≈ 1.571 rad. Since A's wave also travels the longer path, the path lag partly cancels the source lead, so the magnitude of the phase difference at the detector is

2.44 rad − 1.571 rad ≈ 0.87 rad

answered by User Bernard Moeskops (3.2k points)
4 votes

Answer:

The magnitude of the phase difference at the detector ≈ 0.87 rad

Step-by-step explanation:

Given Data:

Wavelength (λ) = 360 m

Path difference (x) = rA − rB = 140 m

The phase difference produced by the path difference is

Φ = (2π/λ) × x

= (2π/360) × 140

≈ 2.4435 rad

The 90° source phase lead of A, converted to radians, is 90° × (π/180°) = π/2 ≈ 1.5708 rad.

Since A's wave travels the longer path, this lead partly cancels the path phase, so the magnitude of the phase difference at the detector is

ΔΦ = 2.4435 rad − 1.5708 rad

≈ 0.87 rad
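If you want to verify the arithmetic, here is a minimal Python sketch of the same calculation (standard library only; the variable names are just illustrative):

import math

wavelength = 360.0               # m
path_difference = 140.0          # m, rA - rB
source_lead = math.radians(90)   # A leads B by 90 degrees = pi/2 rad

# phase lag of A's wave due to its longer path
path_phase = 2 * math.pi * path_difference / wavelength   # ~2.4435 rad

# net phase difference at the detector (magnitude)
delta_phi = abs(path_phase - source_lead)

print(round(delta_phi, 4))       # prints 0.8727

Carrying the unrounded intermediate values gives 0.8727 rad, i.e. ≈ 0.87 rad.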

answered by User Gerges Eid (2.8k points)