Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters? (Hint: Time is distance divided by rate)

A.) 8.3 seconds
B.) 1.2 × 10^–1 seconds
C.) 1.08 × 10^16 seconds
D.) 10.8 × 10^15 seconds

To find the time it takes for a radio signal to travel from the satellite to the surface of the Earth, we divide the distance by the rate. The distance the signal travels is the satellite's height above the surface:

Distance = 3.6 × 10^7 meters

Time = Distance / Rate = (3.6 × 10^7 meters) / (3 × 10^8 meters/second)

Dividing the coefficients and subtracting the exponents:

Time = (3.6 / 3) × 10^(7 − 8) = 1.2 × 10^–1 seconds = 0.12 seconds

Therefore, the correct answer is B.) 1.2 × 10^–1 seconds.
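As a quick sanity check, here is a minimal Python sketch of the same division (the variable names are illustrative, not from the problem):

```python
# Time of flight = distance / rate
signal_speed = 3e8        # radio signal speed in meters per second
satellite_height = 3.6e7  # height of the satellite above Earth's surface in meters

travel_time = satellite_height / signal_speed
print(f"{travel_time:.2e} seconds")  # prints 1.20e-01 seconds
```

The printed value, 1.2 × 10^–1 seconds, matches answer choice B.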