When dots are placed on a page by a laser printer, they must be close enough together that you do not see the individual dots of ink. For this, the separation of the dots must be less than the limit set by Rayleigh's criterion. Take the pupil of the eye to be 3.2 mm in diameter and the distance from the paper to the eye to be 42 cm; find the maximum separation (in cm) of two dots such that they cannot be resolved. (Assume the average wavelength of visible light is 550 nm.)

1 Answer

Answer:

y < 8.8 × 10⁻⁵ m (≈ 8.8 × 10⁻³ cm)

Step-by-step explanation:

For this exercise we use the Rayleigh criterion, which states that two luminous objects are just resolved when the central diffraction maximum of one coincides with the first diffraction minimum of the other.

For a single slit, the first diffraction minimum (m = 1) satisfies

a sin θ = λ

In these experiments the angles involved are very small, so sin θ ≈ θ and

θ = λ / a

For a circular aperture the diffraction problem is solved in polar coordinates, and the solution introduces a numerical factor:

θ = 1.22 λ / a

where a is the diameter of the aperture, here the pupil of the eye.

These angles are measured in radians, so

θ = s / R

where s is the arc length and R is the distance from the eye to the paper. Because the angle is small, the arc equals the linear separation, s = y. Equating the two expressions for θ:

y / R = 1.22 λ / a

y = 1.22 λ R / a

Let's calculate, using λ = 550 nm, R = 0.42 m and a = 3.2 mm = 0.0032 m:

y = 1.22 × 550 × 10⁻⁹ × 0.42 / 0.0032

y = 8.8 × 10⁻⁵ m

At this separation the dots are just resolved according to the Rayleigh criterion, so for them not to be resolved (to appear as a continuous image) their separation must satisfy

y < 8.8 × 10⁻⁵ m ≈ 8.8 × 10⁻³ cm
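
If you want to check the arithmetic, here is a minimal Python sketch of the same calculation; the variable names are only illustrative.

# Rayleigh-criterion estimate for the printer-dot problem (illustrative sketch)
wavelength = 550e-9      # average wavelength of visible light, m
pupil = 3.2e-3           # pupil diameter a, m
distance = 0.42          # paper-to-eye distance R, m

theta = 1.22 * wavelength / pupil   # minimum resolvable angle, rad
y = theta * distance                # minimum resolvable separation, m

print(f"theta = {theta:.2e} rad")
print(f"y = {y:.2e} m = {y * 100:.2e} cm")   # about 8.8e-05 m, i.e. 8.8e-03 cm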
