Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters? (Hint: Time is distance divided by rate.)

(1 point)
Responses

8.3 seconds

1.2 × 10^-1 seconds

1.08 × 10^16 seconds

10.8 × 10^15 seconds

To calculate the time it takes for a radio signal to travel from the satellite to the surface of the Earth, we can use the formula: Time = Distance / Rate.

The distance from the satellite to the surface of the Earth is given as 3.6 × 10^7 meters, and the rate of the radio signal is 3 × 10^8 meters per second.

Plugging these values into the formula, we get: Time = (3.6 × 10^7 meters) / (3 × 10^8 meters per second).

Simplifying the expression by dividing the coefficients and subtracting the exponents, we get: Time = (3.6 / 3) × 10^(7 − 8) = 1.2 × 10^-1 seconds.

Therefore, the correct answer is 1.2 × 10^-1 seconds.
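
As a quick check, the same calculation can be sketched in a few lines of Python (the variable names here are illustrative, not part of the original problem):

```python
# Quick numerical check of the radio signal travel time.
distance_m = 3.6e7       # satellite altitude in meters (3.6 × 10^7 m)
speed_m_per_s = 3.0e8    # speed of a radio signal in meters per second (3 × 10^8 m/s)

travel_time_s = distance_m / speed_m_per_s  # time = distance / rate
print(f"{travel_time_s:.1e} seconds")       # prints 1.2e-01 seconds
```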