Recently there have been technological advances that allow 3-D imaging of objects up to kilometers away. This is done with laser light that bounces off the object.

By measuring the time it takes the laser light to reach the object and return, one can construct a detailed 3-D image of the object. The main obstacle has been getting enough of the light back to actually form an image, but new detectors are sensitive enough to overcome this limitation.

If I want to measure a 3-d object down to millimeter accuracy, to what accuracy in seconds must my detector be able to measure the time of arrival of the light that bounced off the object?

Remember...the light has to bounce there and back.

I am not an expert. I am a very rusty amateur, so this should be checked. But I think the difference in light travel distance you are talking about measuring is 2 mm. That is, 1 mm out into some 1 mm deep feature in the object, and 1 mm back. Plus the 2 km there and back, but forget the 2 km. You are interested only in the difference in object depth: the 2 mm.

In a vacuum, light travels at 299,792,458 m/s, which is 299,792,458,000 mm/s.

It takes light 1 mm / (299,792,458,000 mm/s) = 3.3356409519815203×10^-12 s to travel 1 mm. Multiply that by 2:

6.6712819039630406×10^-12 s to travel 2 mm

So I think to get millimeter accuracy, your detector would have to be able to measure a difference of 6.671*10^-12 s in light travel time. And that is in space.
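The arithmetic above can be sketched in a few lines of Python (a minimal check of the round-trip reasoning, not part of any real detector software):

```python
# Timing resolution needed to resolve a 1 mm depth difference.
# The laser pulse traverses the extra depth twice (out and back),
# so the path-length difference is 2 mm.

C_MM_PER_S = 299_792_458_000  # speed of light in vacuum, in mm/s

depth_resolution_mm = 1.0
round_trip_mm = 2 * depth_resolution_mm  # there and back

delta_t = round_trip_mm / C_MM_PER_S  # required timing resolution, seconds
print(f"Required timing resolution: {delta_t:.4e} s")  # ~6.6713e-12 s, i.e. a few picoseconds
```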

In air, light travels more slowly. I think you divide the vacuum speed of light by the index of refraction of air to get the speed of light in air. And at such a small fraction of a second, the "seeing" would have to be really good. That is, over a distance of 2 km there are normally variations in air temperature, which make the air thicker or thinner in places. Light travels faster through thinner air than through thicker air. So unless the "seeing" is really good, the detector will "see" a "swimming" image instead of a sharp, steady one.
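To get a feel for how much the air matters, here is a rough sketch. The refractive index n = 1.0003 is an assumed typical value for air at sea level, not a measured one:

```python
# Sketch: how air's refractive index shifts the timing requirement.
# N_AIR = 1.0003 is an assumed typical value for sea-level air.

C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s
N_AIR = 1.0003            # assumed refractive index of air

c_air = C_VACUUM / N_AIR  # light is slower in air
round_trip_m = 0.002      # 2 mm round-trip path difference

dt_vacuum = round_trip_m / C_VACUUM
dt_air = round_trip_m / c_air

print(f"vacuum: {dt_vacuum:.4e} s")  # ~6.6713e-12 s
print(f"air:    {dt_air:.4e} s")     # ~6.6733e-12 s, about 0.03% longer
```

So the index of refraction barely changes the required resolution; the turbulence ("seeing") is the bigger practical problem.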

Josh has answered the question for you.

Another method is 0.002 / (3×10^8) (since you took the problem from Brilliant),

giving an answer of about 6.67×10^-12 s.

To measure a 3D object down to millimeter accuracy, you would need to be able to measure the time of arrival of the light that bounced off the object with a high level of precision. Let's break down the calculation to determine the required accuracy in seconds.

Let's assume the distance from the laser to the object and back is D meters. Since the light has to travel this distance twice (to the object and back), the total distance covered is 2D meters.

The speed of light in a vacuum is approximately 299,792,458 meters per second. Therefore, the time it takes for the light to travel a distance of 2D meters can be calculated using the formula:

Time = Distance / Speed of light

Time = (2D meters) / (299,792,458 meters per second)

Now, you mentioned that you want to measure the object to millimeter accuracy. One millimeter is equal to 0.001 meters. Therefore, we need to ensure that the time measurement is accurate enough to distinguish between two different points on the object that are 0.001 meters apart.

Let's represent the required accuracy in seconds as Δt. Because the light has to cover the extra depth twice (out and back), to achieve millimeter accuracy we can set Δt equal to the time it takes light to travel 2 × 0.001 meters:

Δt = (2 × 0.001 meters) / (299,792,458 meters per second) ≈ 6.67×10^-12 seconds

Now we have calculated the required accuracy in seconds: about 6.67 picoseconds, the time it takes light to cover the 2 millimeters of extra round-trip path.

Keep in mind that this is a simplified calculation and doesn't account for factors like the speed of light in a non-vacuum medium or the processing capabilities of the detector. However, this should give you an approximate understanding of the required accuracy in seconds for measuring a 3D object down to millimeter accuracy using laser technology.