A radio signal travels at 3.00⋅10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54⋅10^7 meters? Show your work.

We can use the formula:

time = distance / speed

In this case, the distance is the height of the satellite above the Earth's surface:

distance = 3.54⋅10^7 meters

The speed is the speed of light, since a radio signal is an electromagnetic wave and travels at the same speed:

speed = 3.00⋅10^8 meters per second

Plugging in these values, we get:

time = 3.54⋅10^7 meters / 3.00⋅10^8 meters per second

Dividing the coefficients and the powers of ten separately (10^7 / 10^8 = 10^-1), we get:

time = (3.54 / 3.00)⋅10^-1 = 0.118 seconds

Therefore, it would take a radio signal about 0.118 seconds to travel from the satellite to the surface of the Earth.
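
As a quick sanity check, the same division can be done in a few lines of Python (the variable names here are just illustrative, not from the original problem):

# Check the arithmetic: time = distance / speed
distance_m = 3.54e7       # height of the satellite above Earth's surface, in meters
speed_m_per_s = 3.00e8    # speed of a radio signal (the speed of light), in meters per second

time_s = distance_m / speed_m_per_s
print(time_s)             # prints 0.118

Running this confirms the result of 0.118 seconds, i.e. a little over a tenth of a second.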