Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters? (Hint: Time is distance divided by rate.) (1 point)

It would take 0.12 seconds for a radio signal to travel from a satellite orbiting at a height of 3.6 × 10^7 meters to the surface of Earth.

Time = distance / rate
Time = (3.6 × 10^7 meters) / (3 × 10^8 meters per second)
Time = 0.12 seconds
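The arithmetic above can be checked with a short Python snippet (variable names are illustrative, not from the original):

```python
# time = distance / rate
distance_m = 3.6e7   # satellite altitude in meters (3.6 × 10^7 m)
speed_mps = 3e8      # speed of radio signals (speed of light), m/s

travel_time_s = distance_m / speed_mps
print(travel_time_s)  # → 0.12
```

Dividing the powers of ten directly also confirms it: 3.6/3 = 1.2, and 10^7 / 10^8 = 10^-1, giving 1.2 × 10^-1 = 0.12 seconds.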