Radio signals travel at a rate of 3 * 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 * 10^9 meters?
To calculate the time it would take for a radio signal to travel from the satellite to the surface of the Earth, we can use the formula:
Time = Distance / Speed
Given that the speed of a radio signal is 3 * 10^8 meters per second and the distance from the satellite to the surface of the Earth is 3.6 * 10^9 meters, we can plug these values into the formula:
Time = (3.6 * 10^9) / (3 * 10^8)
Time = (3.6 / 3) * 10^(9-8)
Time = 1.2 * 10^1
Time = 12 seconds
Therefore, it would take 12 seconds for a radio signal to travel from a satellite orbiting at a height of 3.6 * 10^9 meters to the surface of the Earth.
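The arithmetic above can be double-checked with a short script (the variable names are illustrative):

```python
# Time for a radio signal to travel from the satellite to Earth's surface.
speed = 3e8       # speed of a radio signal, in meters per second
distance = 3.6e9  # height of the satellite above the surface, in meters

time = distance / speed  # Time = Distance / Speed
print(time)  # 12.0
```

Because both values are given to one significant exponent step apart, the division reduces cleanly to 12 seconds with no rounding.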