Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters? (Hint: Time is distance divided by rate.) (1 point)
Responses:
8.3 seconds
1.2 × 10^-1 seconds
1.08 × 10^16 seconds
10.8 × 10^15 seconds

1.2 × 10^-1 seconds

Distance = height of the satellite above Earth's surface
Distance = 3.6 x 10^7 meters

Time = Distance / Rate
Time = 3.6 x 10^7 / 3 x 10^8
Time = 0.12 seconds = 1.2 x 10^-1 seconds

So it would take approximately 1.2 x 10^-1 seconds (0.12 seconds) for a radio signal to travel from the satellite to the surface of Earth.
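
As a quick sanity check, here is a minimal Python sketch of the same calculation (the variable names are just illustrative, not part of the problem):

rate = 3e8          # speed of the radio signal in meters per second (speed of light)
distance = 3.6e7    # height of the satellite above Earth's surface, in meters

time_seconds = distance / rate   # time = distance / rate
print(time_seconds)              # prints 0.12, i.e. 1.2 x 10^-1 seconds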