Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters? (Hint: Time is distance divided by rate.)
(1 point)
Responses

8.3 seconds

1.2 × 10^−1 seconds

1.08 × 10^16 seconds

10.8 × 10^15 seconds

To find the time it takes for the radio signal to travel from the satellite to the surface of Earth, divide the distance by the rate:

Time = Distance / Rate

Distance = height of satellite = 3.6 × 10^7 meters
Rate = 3 × 10^8 meters per second

Time = (3.6 × 10^7 meters) / (3 × 10^8 meters per second) = (3.6 / 3) × 10^(7 − 8) seconds = 1.2 × 10^−1 seconds = 0.12 seconds

So, it would take 0.12 seconds for the radio signal to travel from the satellite to the surface of Earth.

The correct answer is:
1.2 × 10^−1 seconds
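
As a quick sanity check, here is a minimal Python sketch of the same division (the variable names are just illustrative):

# Check time = distance / rate for the satellite signal
distance_m = 3.6e7      # satellite height in meters
rate_m_per_s = 3e8      # speed of a radio signal in meters per second

time_s = distance_m / rate_m_per_s
print(f"{time_s:.1e} seconds")  # prints 1.2e-01 seconds

The output 1.2e-01 matches 1.2 × 10^−1 seconds, confirming the choice above.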