158k views
2 votes
A double-slit diffraction pattern is formed on a distant screen. If the separation between the slits decreases, what happens to the distance between interference fringes? Assume the angles involved remain small.

A. The distance between interference fringes remains the same.
B. The distance between interference fringes also decreases.
C. The distance between interference fringes increases.
D. The effect cannot be determined unless the distance between the slits and the screen is known.

asked by Yeahumok (4.9k points)

1 Answer

7 votes

Answer:

Option C. The distance between interference fringes increases.

Step-by-step explanation:

For small angles, the distance between adjacent interference fringes is given by:

y = \frac{\lambda D}{d} \qquad (1)

where:

D = the distance between the slits and the screen

\lambda = the wavelength of the light

d = the separation between the two slits

From equation (1),

y \propto \frac{1}{d}

This relationship shows that the distance between interference fringes is inversely proportional to the separation between the slits.

Therefore, if the separation between the slits decreases, the distance between interference fringes increases.
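As a quick numerical check of this inverse relationship, here is a minimal Python sketch using assumed illustrative values (500 nm light, a 1 m slit-to-screen distance, and slit separations of 0.10 mm and 0.05 mm, none of which come from the original question); halving d doubles the fringe spacing:

```python
# Small-angle double-slit fringe spacing: y = lambda * D / d

def fringe_spacing(wavelength_m, screen_distance_m, slit_separation_m):
    """Return the spacing between adjacent bright fringes (small-angle approximation)."""
    return wavelength_m * screen_distance_m / slit_separation_m

wavelength = 500e-9   # 500 nm light (assumed example value)
D = 1.0               # 1 m slit-to-screen distance (assumed example value)

y_original = fringe_spacing(wavelength, D, 0.10e-3)   # d = 0.10 mm
y_smaller_d = fringe_spacing(wavelength, D, 0.05e-3)  # d halved to 0.05 mm

print(f"y with d = 0.10 mm: {y_original * 1e3:.2f} mm")   # 5.00 mm
print(f"y with d = 0.05 mm: {y_smaller_d * 1e3:.2f} mm")  # 10.00 mm -> spacing doubles
```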

answered by Sujee Maniyam (4.8k points)