Suppose that the central peak of a single-slit diffraction pattern is so wide that the first minima can be assumed to occur at angular positions of ±90°. For this case, what is the ratio of the slit width to the wavelength of the light?

asked by Jonny Lin

1 Answer


Final answer:

When the first minima of a single-slit diffraction pattern lie at angles of ±90°, the ratio of the slit width to the wavelength of the light is 1:1, because sin(90°) = 1 and the diffraction equation reduces to a = λ.

Step-by-step explanation:

If the first minima of a single-slit diffraction pattern occur at angles of ±90°, we can use the formula that relates slit width, wavelength, and the angle to the first minima in single-slit diffraction:

a sin(θ) = mλ, where a is the slit width, θ is the angle to the m-th minimum, and λ is the wavelength of the light.

For the first minima, m = 1. Given that θ = 90° (or π/2 radians), sin(90°) = 1, so the equation reduces to a = λ. Therefore, the ratio of the slit width to the wavelength of the light is 1:1 when the first minima occur at ±90°.
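The arithmetic above can be checked with a short sketch. The function name here is illustrative, not from the original answer; it just solves a sin(θ) = mλ for the ratio a/λ:

```python
import math

def slit_width_to_wavelength_ratio(theta_deg, m=1):
    """Ratio a/λ from the single-slit minima condition a·sin(θ) = m·λ."""
    return m / math.sin(math.radians(theta_deg))

# First minima at ±90° → ratio a/λ = 1/sin(90°) = 1
print(slit_width_to_wavelength_ratio(90))  # → 1.0
```

For smaller angles the ratio grows, e.g. first minima at ±30° would give a/λ = 2, which matches the usual observation that narrower slits (relative to λ) spread the central peak wider.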

answered by Kallz