I apologise for reposting this question, but it is still not working out for me.

I know that:

speed of light c = 3.0 x 10^8 m/s
time interval delta t = 0.19 x 10^-9 s

The relationship between the travel time and the round-trip distance is

d=ct

Also, if there is an uncertainty delta t in the time, then it introduces an uncertainty delta d in the distance,

i.e. d + delta d = c(t + delta t), and subtracting d = ct leaves delta d = c (delta t).
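That propagation step can be checked symbolically. A minimal sketch in Python, assuming sympy is available (the symbol names are my own):

import sympy as sp

c, t, dt = sp.symbols("c t delta_t", positive=True)
d = c * t                        # base relation d = ct
d_plus = c * (t + dt)            # perturbed relation d + delta d = c(t + delta t)
print(sp.expand(d_plus - d))     # leftover uncertainty: prints c*delta_t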

Knowing all of this, I still don't understand how to work out the problem. I am continuously coming up with the incorrect answer. Please help!

Question:
The distance between Earth and the Moon can be determined from the time it takes for a laser beam to travel from Earth to a reflector on the Moon and back. If the round-trip time can be measured to an accuracy of 0.19 ns (1 ns = 10^-9 s), what is the corresponding error in the Earth-Moon distance?

The Earth-Moon distance is d, and the round-trip travel time is

t = 2d/c
d = (c/2) t

delta d = (c/2) * (delta t)
= (1.5 x 10^8 m/s) * (0.19 x 10^-9 s)
= 2.9 x 10^-2 m = 2.9 cm
(about one inch)
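
For anyone who wants to check the arithmetic, here is a minimal Python sketch (the constant names are my own; the numbers come straight from the problem statement):

C = 3.0e8          # speed of light, m/s
DELTA_T = 0.19e-9  # round-trip timing accuracy, s

# Round trip: t = 2d/c, so d = (c/2)*t and the timing error
# propagates as delta_d = (c/2)*delta_t.
delta_d = (C / 2) * DELTA_T
print(f"delta_d = {delta_d:.2e} m")  # delta_d = 2.85e-02 m, about 2.9 cm

Note the factor of 1/2: the measured time covers the trip out and back, so only half of c * (delta t) ends up in the one-way distance. Using d = ct instead gives twice the answer, about 5.7 cm.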