A real double slit, consisting of two slits with finite widths, produces a pattern that is a combination of the diffraction due to a single slit and the interference due to an idealized double slit (consisting of two slits of infinitesimal width). Find the ratio of the width of the slits to the separation between them that will produce an interference pattern such that the first minimum of the single-slit pattern falls on the fifth maximum of the idealized double-slit pattern. (This will greatly reduce the intensity of the fifth maximum in the real double-slit interference pattern.)

asked by User Andika

1 Answer


Answer:

Step-by-step explanation:

Let the slit width be a, the slit separation d, and the distance to the screen D, and let the wavelength of the light be λ.

For diffraction through a single slit, the first minimum satisfies a sin θ = λ. Using the small-angle approximation sin θ ≈ x / D, its position measured from the central fringe is

x = λD / a

For the idealized double-slit interference pattern, the mth maximum satisfies d sin θ = mλ, so the position of the fifth maximum from the central fringe is

x = 5λD / d

Since these positions are equal:

λD / a = 5λD / d

d = 5a

a / d = 1 / 5 = 0.2
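
As a quick numerical check, you can evaluate the single-slit envelope at the angle of the fifth double-slit maximum and confirm it vanishes. The sketch below uses assumed illustrative values for the wavelength and slit width; the result is independent of both, since only the ratio a/d matters.

import numpy as np

# Assumed illustrative values; the ratio a/d = 1/5 is wavelength-independent.
wavelength = 500e-9   # wavelength of light (m)
a = 1.0e-6            # slit width (m)
d = 5 * a             # slit separation, from the derived ratio d = 5a

# Angle of the idealized double-slit's 5th maximum: d sin(theta) = 5 lambda
theta5 = np.arcsin(5 * wavelength / d)

# Single-slit envelope (sin(beta)/beta)^2 with beta = pi a sin(theta)/lambda.
# np.sinc(x) computes sin(pi x)/(pi x), so pass a sin(theta)/lambda directly.
envelope = np.sinc(a * np.sin(theta5) / wavelength) ** 2
print(envelope)  # ~0: the envelope's first minimum suppresses the 5th maximum

The printed value is zero to floating-point precision, confirming that the fifth interference maximum coincides with the first diffraction minimum when a/d = 1/5.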

answered by User Heyjinkim