Answer:
Step-by-step explanation:
Given that,
The diffraction grating is ruled with 900 lines/cm.
The detector length is
h = 1.2 cm = 1.2/100 = 0.012 m
The short wavelength is λs = 564.6 nm
The long wavelength is λl = 571.2 nm
Question: how far from the grating should the screen with the detector on it be placed so that the entire 7th-order spectrum spans the detector? x = ?
Since there are 900 lines per centimeter, adjacent lines are separated by 1/900 of a centimeter.
The distance between slits, d, is given by
d = 1/900 cm
d = 0.00111 cm
d = 0.0000111 m = 1.11×10^-5 m
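As a quick numerical check, a few lines of Python reproduce this spacing (the variable names are my own, not from the problem):

    lines_per_cm = 900            # ruling density of the grating
    d_cm = 1 / lines_per_cm       # slit spacing in centimeters
    d = d_cm / 100                # slit spacing in meters
    print(d)                      # ~1.11e-05 m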
Let us call the two angles θs for the short wavelength (564.6 nm) and θl for the long wavelength (571.2 nm). The diffraction grating equation is
d sin(θs) = mλs
sin(θs) = mλs / d
Since we are given the 7th order, m = 7:
sin(θs) = 7×564.6×10^-9 / 1.11×10^-5
sin(θs) = 0.3557
θs = arcsin(0.3557)
θs = 20.84°
Applying the same principle for the long wavelength:
d sin(θl) = mλl
sin(θl) = mλl / d
Again, with m = 7:
sin(θl) = 7×571.2×10^-9 / 1.11×10^-5
sin(θl) = 0.3599
θl = arcsin(0.3599)
θl = 21.1°
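Both angles can be verified with a short Python sketch of the same grating equation (again, the variable names are my own):

    import math

    d = 1e-2 / 900                # slit spacing in meters
    m = 7                         # diffraction order
    lam_s = 564.6e-9              # short wavelength (m)
    lam_l = 571.2e-9              # long wavelength (m)

    theta_s = math.degrees(math.asin(m * lam_s / d))
    theta_l = math.degrees(math.asin(m * lam_l / d))
    print(theta_s, theta_l)       # ~20.84 and ~21.09 degrees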
Using trigonometry,
tan θ = opposite/adjacent
The adjacent side is x, the distance from the grating to the screen, and the opposite side is the position y of a line on the screen measured from the central axis, so each spectral line lands at
y = x tan(θ)
For the entire 7th-order spectrum to span the detector, the separation between the long- and short-wavelength lines must equal the detector length h:
h = x tan(θl) - x tan(θs)
x = h / (tan(θl) - tan(θs))
x = 0.012 / (0.3857 - 0.3806)
x = 0.012 / 0.0051
x ≈ 2.35 m
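The full calculation, end to end, takes only a few lines of Python (a minimal sketch assuming the y = x tan(θ) geometry used above):

    import math

    d = 1e-2 / 900                    # slit spacing (m)
    m = 7                             # diffraction order
    lam_s, lam_l = 564.6e-9, 571.2e-9 # wavelengths (m)
    h = 0.012                         # detector length (m)

    theta_s = math.asin(m * lam_s / d)
    theta_l = math.asin(m * lam_l / d)
    x = h / (math.tan(theta_l) - math.tan(theta_s))
    print(x)                          # ~2.35 m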