X-rays with a wavelength of 1.70 Å scatter at an angle of 21.5 degrees from a crystal. If n = 1, what is the distance between planes of atoms in the crystal that give rise to this scattering?

So far I've gotten 0.367 radians, but I have no idea where to go from there.

n*λ = 2d*sin(θ)

Solve for d.

To solve this problem, you need to use Bragg's Law, which relates the scattering angle, the wavelength of the X-rays, and the spacing between planes of atoms in the crystal.

Bragg's Law is expressed as:

n * λ = 2 * d * sin(θ)

Where:
- n is the order of the diffraction (here, n = 1)
- λ is the wavelength of the X-rays
- d is the spacing between planes of atoms (the interplanar distance)
- θ is the scattering angle

In your problem, you are given:
- n = 1
- λ = 1.70 Å = 1.70 * 10^(-10) m
- θ = 21.5°

You don't actually need to convert the angle to radians by hand; just make sure your calculator is in degree mode when you take sin(21.5°). (If your software expects radians, 21.5° = 21.5 * π / 180 ≈ 0.375 rad.)

Now, we can rearrange Bragg's Law to solve for d:
d = (n * λ) / (2 * sin(θ))

Plug in the values:
d = (1 * 1.70 * 10^(-10) m) / (2 * sin(21.5°))

Evaluate the right-hand side, using sin(21.5°) ≈ 0.3665:

d ≈ (1.70 * 10^(-10) m) / 0.7330 ≈ 2.32 * 10^(-10) m

Therefore, the distance between planes of atoms in the crystal that give rise to this scattering is approximately 2.32 * 10^(-10) m, or 2.32 Å.
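
If you want to double-check the arithmetic, here is a minimal Python sketch of the same calculation (the function name `bragg_spacing` and the variable names are just illustrative):

```python
import math

def bragg_spacing(wavelength_m, theta_deg, n=1):
    """Interplanar spacing d from Bragg's law: n * lambda = 2 * d * sin(theta)."""
    return (n * wavelength_m) / (2 * math.sin(math.radians(theta_deg)))

wavelength = 1.70e-10   # 1.70 Angstrom, expressed in meters
theta = 21.5            # scattering angle in degrees
d = bragg_spacing(wavelength, theta)
print(f"d = {d:.3e} m  ({d * 1e10:.2f} Angstrom)")  # ~2.32e-10 m
```

The key point is `math.radians(theta_deg)`: Python's `math.sin` expects radians, which is the same degree-mode/radian-mode issue to watch for on a calculator.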