3 votes
A plane wave with a wavelength of 500 nm is incident normally on a single slit with a width of 5.0*10^-6 m. Consider waves that reach a point on a far-away screen such that rays from the slit make an angle of 1.0° with the normal. The difference in phase for waves from the top and bottom of the slit is:

Answer choices:
- 2.2 rad
- 1.1 rad
- 0.55 rad
- 0 rad
- 1.6 rad

2 Answers

4 votes

Answer: b, 1.1 rad

Step-by-step explanation:

We're going to use the relation between phase difference and path difference to solve this:

x/λ = Φ/2π, so that

Φ = 2πx / λ

Where

Φ = phase difference between two waves

x = path difference between the two waves

λ = the wavelength

Given

λ = 500 nm

θ = 1°

d = 5*10^-6 m

x = dsinθ, so that

Φ = 2πdsinθ / λ

Φ = [2 * π * 5*10^-6 * sin1°] / 500*10^-9

Φ = [3.142*10^-5 * 0.01745] / 500*10^-9

Φ = 5.483*10^-7 / 500*10^-9

Φ = 1.0966 rad

Φ = 1.1 rad

Therefore, the answer is B, 1.1 rad
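The steps above can be checked numerically. A minimal sketch in Python (variable names are my own, not from the problem):

```python
import math

# Given values from the problem, in SI units
wavelength = 500e-9    # 500 nm
slit_width = 5.0e-6    # slit width d, in metres
theta = math.radians(1.0)  # angle with the normal

# Path difference between rays from the top and bottom of the slit
path_diff = slit_width * math.sin(theta)

# Phase difference: Φ = 2π * x / λ
phase_diff = 2 * math.pi * path_diff / wavelength

print(round(phase_diff, 4))  # → 1.0966
```

Rounding to two significant figures gives 1.1 rad, matching choice b.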

by Madie (6.1k points)
2 votes

Answer:

1.1 rad

Step-by-step explanation:

Let's use the phase-difference equation

Φ = 2πx / λ

Where

x = path difference between the two points = d sinθ


d = 5*10^-6 m

θ = 1°

λ = wavelength = 500 nm = 500*10^-9 m

Therefore,

Φ = 2πx / λ

Φ = 2πd sinθ / λ

= (2π * 5*10^-6 * sin 1°) / (500*10^-9)

Φ ≈ 1.097 rad

= 1.1 rad

by Sorina (5.3k points)