A radio signal travels at 3 x 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 x 10^7 meters?

(3.54 x 10^7 m) / (3 x 10^8 m/s)

= 0.118 seconds

To answer the question, we need to determine the distance between the satellite and the surface of the Earth, and then divide that distance by the speed of the radio signal.

The height of the satellite above the Earth's surface is given as 3.54 x 10^7 meters. Because this height is measured from the surface, it is already the distance the signal must travel. There is no need to add the Earth's radius (about 6.371 x 10^6 meters); doing so would give the distance from the satellite to the Earth's center, not to the surface.

So, the distance from the satellite to the surface of the Earth is:
Distance = satellite height
= 3.54 x 10^7 meters

Now, we can divide this distance by the speed of the radio signal, which is given as 3 x 10^8 meters per second:
Time = Distance / Speed
= (3.54 x 10^7 m) / (3 x 10^8 m/s)
= 0.118 seconds

Therefore, it will take approximately 0.118 seconds for the radio signal to travel from the satellite to the surface of the Earth.
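
As a quick numerical check, here is a minimal Python sketch of the same division. The variable names (signal_speed_m_per_s, satellite_height_m, travel_time_s) are illustrative choices, not part of the original problem.

# Time for a radio signal to travel from the satellite to Earth's surface
signal_speed_m_per_s = 3e8     # given speed of the radio signal, in m/s
satellite_height_m = 3.54e7    # given height above the surface, in m

# Time = Distance / Speed
travel_time_s = satellite_height_m / signal_speed_m_per_s
print(f"{travel_time_s:.3f} seconds")  # prints 0.118 seconds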