Answer:
The distance between interference fringes increases.
Step-by-step explanation:
In a double-slit interference pattern, the distance of the n-th order bright fringe from the centre of the pattern is
$$y = \frac{n\lambda D}{d}$$
where $\lambda$ is the wavelength of the light, $D$ is the distance of the screen from the slits, and $d$ is the separation between the slits.
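As a quick illustration of this formula, here is a minimal Python sketch; the wavelength, screen distance, and slit separation values are illustrative assumptions, not values from the problem:

```python
# Position of the n-th order bright fringe: y = n * lambda * D / d
wavelength = 500e-9   # wavelength in metres (assumed, roughly green light)
D = 1.0               # slit-to-screen distance in metres (assumed)
d = 0.25e-3           # slit separation in metres (assumed)

def fringe_position(n):
    """Distance of the n-th order bright fringe from the pattern centre."""
    return n * wavelength * D / d

for n in range(4):
    print(f"n = {n}: y = {fringe_position(n) * 1e3:.2f} mm")
```

Each fringe sits a fixed distance $\lambda D / d$ beyond the previous one, which is exactly the spacing derived next.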
If we take two adjacent fringes, n and (n+1), their distance is
$$\Delta y = \frac{(n+1)\lambda D}{d} - \frac{n\lambda D}{d} = \frac{\lambda D}{d}$$
so the fringe spacing is inversely proportional to the slit separation, $d$ (and independent of $n$).
Therefore, if the separation between the slits decreases, the distance between the interference fringes increases.
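To check this conclusion numerically, the sketch below (using the same assumed values as above) halves the slit separation and shows the fringe spacing doubling:

```python
# Fringe spacing: delta_y = lambda * D / d
wavelength = 500e-9   # assumed wavelength in metres
D = 1.0               # assumed screen distance in metres

for d in (0.25e-3, 0.125e-3):  # halve the slit separation
    delta_y = wavelength * D / d
    print(f"d = {d * 1e3:.3f} mm -> fringe spacing = {delta_y * 1e3:.1f} mm")
```

With these assumed values, cutting $d$ from 0.250 mm to 0.125 mm doubles the spacing from 2.0 mm to 4.0 mm.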