What is the minimum lens size (in meters) that will permit just resolving two stars 1.1 arc seconds apart, using light of 510 nm?

Limiting resolution (in radians) = 1.22 * (wavelength) / (diameter)

1.1 arc second = [1.1 sec / (3600 sec/deg)] * (1 rad / 57.3 deg) = 5.33 * 10^-6 rad

5.33*10^-6 = 1.22*510*10^-9/D

Solve for diameter D

Is D ≈ 11.7 * 10^-2 m?

I solved D ≈ 11.7 * 10^-2 m, but the answer key says 0.0956 m.

To determine the minimum lens size required to resolve the two stars, we can use the concept of angular resolution: the smallest angular separation between two point sources that an optical instrument can distinguish.

The Rayleigh criterion gives the angular resolution as:

θ = 1.22 * (λ / D)

Where:
θ is the angular resolution,
λ is the wavelength of light,
D is the diameter of the lens or aperture.

In this case, we want to resolve two stars that are 1.1 arc seconds apart using light of wavelength 510 nm (510 x 10^-9 meters).

Substituting the wavelength into the formula:

θ = 1.22 * (λ / D)
θ = 1.22 * (510 x 10^-9 meters) / D

The angular separation of the stars is given in seconds of arc, so we need to convert 1.1 arc seconds to radians:

θ = (1.1 / 3600) * (π / 180) rad ≈ 5.33 x 10^-6 rad
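As a quick check of this conversion, here is a minimal Python sketch (the variable name theta is just illustrative):

    import math

    # 1 degree = 3600 arc seconds; math.radians converts degrees to radians
    theta = math.radians(1.1 / 3600)
    print(theta)  # ~5.33e-6 rad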

Now we can solve for the diameter of the lens (D):

D = 1.22 * (λ / θ)
D = 1.22 * (510 x 10^-9 meters) / ((1.1 / 3600) * (π / 180))

Plugging in the values gives D = 1.22 * (510 x 10^-9 m) / (5.33 x 10^-6 rad) ≈ 0.117 m, which agrees with your result of 11.7 x 10^-2 m. The answer key's 0.0956 m is exactly this value divided by 1.22; in other words, it corresponds to using θ = λ / D without the Rayleigh factor, so the difference comes from which criterion is applied, not from your arithmetic.
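For reference, the same arithmetic as a minimal Python sketch (variable names are illustrative), computing D both with and without the 1.22 Rayleigh factor:

    import math

    wavelength = 510e-9               # 510 nm, in meters
    theta = math.radians(1.1 / 3600)  # 1.1 arc seconds, converted to radians

    # Rayleigh criterion: theta = 1.22 * wavelength / D, solved for D
    D_rayleigh = 1.22 * wavelength / theta
    print(D_rayleigh)  # ~0.117 m

    # Without the 1.22 factor (theta = wavelength / D), which matches the answer key
    D_simple = wavelength / theta
    print(D_simple)    # ~0.0956 m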

Note: this calculation assumes perfect optical conditions and ignores other factors that can affect resolution, such as atmospheric turbulence.

The arithmetic itself is easiest to verify with a calculator or a short program, as in the sketches above.