How long will a radio wave with a wavelength of 200 m take to travel to and from the moon from the earth? (Distance of moon from earth = 3.9 x 10^8 m)

The speed of an electromagnetic wave is about 3 x 10^8 m/s.

distance = 2 x 3.9 x 10^8 m = 7.8 x 10^8 m

time = distance / speed
     = 7.8 x 10^8 / 3 x 10^8
     = 2.6 seconds

To determine how long it would take for a radio wave to travel to and from the moon, we need to calculate the total distance traveled by the radio wave and then divide it by the speed of light.

The distance from the Earth to the moon is given as 3.9 x 10^8 meters, and the wavelength of the radio wave is given as 200 meters.

First, we calculate the total distance traveled by the radio wave. Since it has to travel to the moon and back, the total distance is 2 times the distance from the Earth to the moon:

Total Distance = 2 * Distance from Earth to Moon = 2 * 3.9 x 10^8 meters

Next, we use the speed of light, which in a vacuum is approximately 3 x 10^8 meters per second. Since radio waves are electromagnetic waves, they travel at this speed regardless of wavelength, so the 200 m wavelength does not affect the travel time.

Now, we can calculate the time it would take for the radio wave to travel to and from the moon:

Time = Total Distance / Speed of Light

Time = (2 * 3.9 x 10^8 meters) / (3 x 10^8 meters per second)

Simplifying the equation, we get:

Time = (7.8 x 10^8 meters) / (3 x 10^8 meters per second) ≈ 2.6 seconds

Therefore, a radio wave with a wavelength of 200 meters would take approximately 2.6 seconds to travel to and from the moon.
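For anyone who wants to check the arithmetic, here is a minimal Python sketch (the variable names are my own, chosen for readability) that reproduces the time = distance / speed calculation from the values given in the question:

```python
# Values given in the question / known constants
earth_moon_distance = 3.9e8   # metres, distance from Earth to the moon
speed_of_light = 3.0e8        # m/s, approximate speed of EM waves in vacuum

# The wave travels to the moon and back, so double the one-way distance
round_trip_distance = 2 * earth_moon_distance

# time = distance / speed
travel_time = round_trip_distance / speed_of_light

print(f"Round-trip time: {travel_time:.1f} s")  # prints 2.6 s
```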