A radio signal travels at 3.00 × 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 × 10^7 meters? Show your work.

We can use the equation:

time = distance / speed

The distance in this case is the height of the satellite above the surface of the Earth, 3.54 × 10^7 meters. The speed is the speed of the radio signal (the speed of light), 3.00 × 10^8 meters per second. Substituting these values into the equation gives:

time = (3.54 × 10^7 m) / (3.00 × 10^8 m/s)

time = 0.118 seconds

Therefore, it will take approximately 0.118 seconds for a radio signal to travel from a satellite orbiting at a height of 3.54 × 10^7 meters to the surface of the Earth.
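
As a quick check, the same calculation can be done in a few lines of Python (a minimal sketch; the variable names are illustrative, not from the original problem):

# Quick numerical check of time = distance / speed
distance_m = 3.54e7      # height of the satellite above Earth's surface, in meters
speed_m_per_s = 3.00e8   # speed of the radio signal (speed of light), in m/s

travel_time_s = distance_m / speed_m_per_s
print(f"Travel time: {travel_time_s:.3f} seconds")  # prints 0.118 seconds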