A thin uniform film of refractive index 1.750 is placed on a sheet of glass with a refractive index of 1.50. At room temperature (18.6 °C), this film is just thick enough for light with a wavelength of 582.3 nm reflected off the top of the film to be canceled by light reflected from the top of the glass. After the glass is placed in an oven and slowly heated to 177 °C, you find that the film cancels reflected light with a wavelength of 587.2 nm. What is the coefficient of linear expansion of the film? (Ignore any changes in the refractive index of the film due to the temperature change.) Express your answer using two significant figures.

1 Answer


Answer:


\alpha \approx 5.3 \times 10^{-5} \ \text{per } ^\circ\text{C}

Step-by-step explanation:

At room temperature the film is just thick enough to cancel the reflected light.

Since the film (index 1.750) lies between air and lower-index glass (1.50), only the wave reflected at the top (air–film) surface picks up a half-cycle phase shift; the wave reflected at the film–glass surface does not. The two reflected waves therefore already start half a cycle apart, so they cancel when the round-trip optical path in the film is a whole number of wavelengths:


2\mu t = m\lambda , \qquad m = 1, 2, 3, \dots

"Just thick enough" means the smallest such thickness, i.e. m = 1, so


t = \frac{\lambda}{2\mu}

here we have


\lambda = 582.3\ \text{nm}


\mu = 1.750

so the film thickness at 18.6 °C is


t = \frac{582.3\ \text{nm}}{2(1.750)}


t = 166.4\ \text{nm}
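
In case it helps, here is a minimal Python sketch of this thickness calculation (the variable names are just illustrative, not from the problem):

# Thinnest film that cancels the reflection: only the top (air-film)
# reflection is phase-shifted, so cancellation needs 2*mu*t = m*lambda,
# and the thinnest film corresponds to m = 1.
mu = 1.750              # refractive index of the film
lam_cold = 582.3e-9     # wavelength cancelled at 18.6 degC, in metres

t_cold = lam_cold / (2 * mu)
print(f"t at 18.6 degC = {t_cold * 1e9:.1f} nm")   # about 166.4 nm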

Now, when the temperature is raised to 177 °C, the film cancels


\lambda_2 = 587.2\ \text{nm}

so the expanded thickness is


t' = \frac{587.2\ \text{nm}}{2(1.750)}


t' = 167.8\ \text{nm}

The linear thermal expansion relation connects the two thicknesses:


t' = t\,(1 + \alpha\,\Delta T)


167.8 = 166.4\,\big(1 + \alpha\,(177 - 18.6)\big)


\alpha \approx 5.3 \times 10^{-5} \ \text{per } ^\circ\text{C}

(Since only the ratio t'/t enters this last step, the same value follows directly from \alpha = (\lambda_2/\lambda_1 - 1)/\Delta T, so the choice of interference order does not affect the answer.)
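
As a quick cross-check, here is a minimal Python sketch of the final step; it uses only the numbers given in the problem and exploits the fact that the thickness is proportional to the cancelled wavelength:

# Coefficient of linear expansion from the shift in the cancelled wavelength.
# Because t is proportional to lambda, only the wavelength ratio matters:
# alpha = (lam_hot / lam_cold - 1) / delta_T.
lam_cold = 582.3          # nm, cancelled at 18.6 degC
lam_hot = 587.2           # nm, cancelled at 177 degC
delta_T = 177.0 - 18.6    # temperature change in Celsius degrees

alpha = (lam_hot / lam_cold - 1) / delta_T
print(f"alpha = {alpha:.2e} per degC")   # about 5.3e-05 per degC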
