Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters? (Hint: Time is distance divided by rate)

A.) 8.3 seconds
B.) 1.2 × 10^–1 seconds
C.) 1.08 × 10^16 seconds
D.) 10.8 × 10^15 seconds

To calculate the time it takes for a radio signal to travel from a satellite to the surface of the Earth, we need to divide the distance between the satellite and the surface of the Earth by the rate at which radio signals travel.

The height of the satellite is given as 3.6 × 10^7 meters.

The rate at which radio signals travel is given as 3 × 10^8 meters per second.

So, the time it takes for the radio signal to travel from the satellite to the surface of the Earth is:
Time = Distance / Rate
Time = (3.6 × 10^7 meters) / (3 × 10^8 meters per second)
Time = (3.6 × 10^7) / (3 × 10^8) = (3.6/3) × 10^(7 − 8)
Time = 1.2 × 10^(7 − 8)
Time = 1.2 × 10^(−1)

Therefore, it would take 1.2 × 10^(−1) seconds (0.12 seconds) for the radio signal to travel from the satellite to the surface of the Earth.
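
As a quick sanity check, here is a minimal Python sketch of the same division, using only the values given in the problem (3.6 × 10^7 m and 3 × 10^8 m/s); the variable names are chosen for illustration.

    # Values from the problem statement
    distance = 3.6e7   # satellite height in meters
    rate = 3e8         # speed of a radio signal in meters per second

    # Time is distance divided by rate
    time = distance / rate
    print(time)        # 0.12, i.e. 1.2 × 10^(-1) seconds

Running this prints 0.12, which matches the hand calculation above.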

The correct answer is B.) 1.2 × 10^–1 seconds.