X-rays with a wavelength of 1.70 Å scatter at an angle of 36.5° from a crystal. If n = 1, what is the distance between the planes of atoms in the crystal that give rise to this scattering?

A) 1.70 Å
B) 2.34 Å
C) 3.11 Å
D) 4.68 Å

1 Answer


Final answer:

Using Bragg's law, nλ = 2d sin(θ), with the given values, the spacing between the atomic planes responsible for this scattering is 2.34 Å, which is option B.

Step-by-step explanation:

The problem asks for the distance between the planes of atoms in a crystal that scatter X-rays at a given angle. Bragg's law relates these quantities: nλ = 2d sin(θ), where n is the order of diffraction, λ is the X-ray wavelength, d is the spacing between the crystal planes, and θ is the Bragg angle. Rearranging for the spacing gives d = nλ / (2 sin(θ)). Substituting n = 1, λ = 1.70 Å (1 Å = 0.1 nm), and the quoted angle of 36.5° gives the spacing listed as option B, 2.34 Å.
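For readers who want to check the rearrangement numerically, below is a minimal Python sketch of d = nλ / (2 sin θ). The function name bragg_plane_spacing and its parameters are illustrative choices, not part of the original problem, and the example assumes the quoted 36.5° is the Bragg angle θ measured from the crystal planes rather than the full deflection angle 2θ.

```python
import math

def bragg_plane_spacing(wavelength, theta_deg, n=1):
    """Interplanar spacing d from Bragg's law, n*lambda = 2*d*sin(theta).

    wavelength and the returned d share the same unit (angstroms here);
    theta_deg is the Bragg angle in degrees, measured from the planes.
    """
    return n * wavelength / (2.0 * math.sin(math.radians(theta_deg)))

# Values from the question: lambda = 1.70 Å, n = 1, angle = 36.5 degrees
# (taken here as the Bragg angle theta).
d = bragg_plane_spacing(wavelength=1.70, theta_deg=36.5, n=1)
print(f"d = {d:.2f} Å")
```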

answered by Edward Hartnett (7.8k points)