A silicon wafer is used to attenuate the intensity from a laser emitting at a wavelength of 0.8 µm. If the laser output power is 100 mW, what wafer thickness is required to attenuate the power to 1 mW?

1 Answer


Answer:

t = 4.605 \times 10^{-3} cm = 46.05 \mu m

Step-by-step explanation:

Given data:

wavelength of emission: \lambda = 0.8 \mu m

laser output power: P_0 = 100 mW

attenuated output power: P = 1 mW

We can read the absorption coefficient off the graph of absorption coefficient vs. wavelength for silicon: at \lambda = 0.8 \mu m, the absorption coefficient is \alpha = 10^{3} cm^{-1}.

Intensity decays with thickness according to the Beer-Lambert law (using \alpha for the absorption coefficient, to avoid confusing it with the wavelength \lambda):

I(t) = I_0 \, e^{-\alpha t}

Substituting the values and solving for the thickness:

1 \times 10^{-3} = 100 \times 10^{-3} \, e^{-10^{3} t}

0.01 = e^{-10^{3} t}

10^{3} t = \ln(100) = 4.605

t = 4.605 \times 10^{-3} cm = 46.05 \mu m
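
As a quick numerical check, here is a minimal Python sketch of the same calculation (the value \alpha = 10^{3} cm^{-1} is the one read off the absorption graph above; variable names are illustrative):

```python
import math

# Given values
P0 = 100e-3    # incident laser power, W
P = 1e-3       # transmitted power after the wafer, W
alpha = 1e3    # absorption coefficient of Si at 0.8 um, cm^-1 (from the graph)

# Beer-Lambert law: P = P0 * exp(-alpha * t)  =>  t = ln(P0 / P) / alpha
# Since alpha is in cm^-1, t comes out in cm.
t_cm = math.log(P0 / P) / alpha

print(f"t = {t_cm:.4e} cm = {t_cm * 1e4:.2f} um")
# -> t = 4.6052e-03 cm = 46.05 um
```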
