Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10^7 meters?

We can use the formula distance = speed × time to solve this problem.

The signal travels from the satellite straight down to the surface of the Earth, so the relevant distance is simply the satellite's height above the surface, 3.6 × 10^7 meters. (Earth's radius, approximately 6.4 × 10^6 meters, would only come into play if the distance were measured from Earth's center, which is not what the question asks.)

Using the formula distance = speed × time, we can rearrange to solve for time: time = distance / speed.

Plugging in the values, we have:
time = (3.6 × 10^7 meters) / (3 × 10^8 meters per second)

Separating the powers of ten, we get:
time = (3.6 / 3) × 10^(7 − 8) seconds = 1.2 × 10^(-1) seconds = 0.12 seconds

Therefore, it would take 0.12 seconds for a radio signal to travel from the satellite to the surface of the Earth.
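
As a quick sanity check, here is a minimal Python sketch (not part of the original solution) that reproduces the division; the variable names c and h are just labels chosen for this example.

```python
# Quick check of the arithmetic (illustrative only).
# Assumed values from the problem: signal speed 3e8 m/s, satellite height 3.6e7 m.
c = 3e8    # speed of the radio signal in meters per second
h = 3.6e7  # height of the satellite above Earth's surface in meters

t = h / c  # time = distance / speed
print(t)   # prints 0.12 (seconds)
```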