Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters?

A.) 8.3 seconds
B.) 1.2 × 10^–1 seconds
C.) 1.08 × 10^16 seconds
D.) 10.8 × 10^15 seconds

The satellite's height of 3.6 × 10^7 meters is measured from the surface of the Earth, so it is already the full distance the radio signal must travel; there is no need to add the radius of the Earth.

To find the time it takes for the radio signal to travel this distance, we can use the formula time = distance / speed.

Plugging in the values, we get time = (3.6 × 10^7 meters) / (3 × 10^8 meters per second) = 0.12 seconds, which is 1.2 × 10^–1 seconds.

Therefore, the correct answer is B.) 1.2 × 10^–1 seconds.
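
As a quick sanity check, here is a minimal Python sketch of the distance-over-speed calculation (the variable names are illustrative, not part of the original problem):

```python
# Speed of a radio signal (meters per second)
signal_speed = 3e8

# Height of the satellite above Earth's surface (meters); this is the
# full distance the signal travels, so Earth's radius is not added.
satellite_height = 3.6e7

# time = distance / speed
travel_time = satellite_height / signal_speed
print(f"{travel_time:.2e} seconds")  # prints 1.20e-01 seconds
```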