Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters? (Hint: Time is distance divided by rate.)

A. 8.3 seconds
B. 1.2 × 10^-1 seconds
C. 1.08 × 10^16 seconds
D. 10.8 × 10^15 seconds

B. 1.2 × 10^-1 seconds

To calculate the time it takes for the radio signal to travel from the satellite to the surface of the Earth, we first need the distance the signal travels. The satellite's height is given as 3.6 × 10^7 meters above the Earth's surface, so that height is already the full distance from the satellite to the surface; the radius of the Earth is not part of the signal's path and should not be added.

Distance = 3.6 × 10^7 meters

Now we can calculate the travel time using the formula from the hint: time = distance/rate

Time = (3.6 × 10^7 meters) / (3 × 10^8 meters/second)
Time = 0.12 seconds = 1.2 × 10^-1 seconds

Therefore, the correct answer is B. 1.2 × 10^-1 seconds.
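
As a quick sanity check, here is a minimal Python sketch of the same arithmetic; the variable names are illustrative, and the values come straight from the problem statement.

# Quick check: time = distance / rate
distance_m = 3.6e7       # satellite height above Earth's surface, in meters
speed_m_per_s = 3.0e8    # speed of a radio signal (speed of light), in m/s

time_s = distance_m / speed_m_per_s
print(time_s)            # prints 0.12, i.e. 1.2 × 10^-1 seconds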