Radio signals travel at a rate of 3 * 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6 * 10^6 meters? (1 point)

3.2 x 10^2 seconds
3.2 x 10^-2 seconds
3.13 x 10^1 seconds
2.88 x 10^15 seconds

I think I'm close to getting it... but I'm still not quite sure. Please help?

So is SkatingDJ right?

I meant PsyDAG

Great, thanks so much :) I was debating whether it was the -2 or the positive 2 exponent. Thanks again!

Hello. Mine's different...

Radio signals travel at a rate of 3 x 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 x 10^9 meters?

To solve this problem, you can use the formula:

Time = Distance / Speed

The distance in this case is the height of the satellite above the Earth's surface, which is 9.6 * 10^6 meters. The speed is the rate at which radio signals travel, which is 3 * 10^8 meters per second.

Now, we can substitute these values into the formula to find the time it takes for the radio signal to travel from the satellite to the surface of the Earth:

Time = (9.6 * 10^6 meters) / (3 * 10^8 meters/second)

To simplify this calculation, separate the numbers from the powers of ten:

Time = (9.6 / 3) * (10^6 / 10^8) seconds

Time = 3.2 * 10^(6 - 8) seconds

Time = 3.2 * 10^-2 seconds

Therefore, it would take 3.2 x 10^-2 seconds (0.032 seconds) for a radio signal to travel from the satellite to the surface of the Earth.

So, the correct answer is 3.2 x 10^-2 seconds.
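If you want to double-check the arithmetic, here is a minimal Python sketch (the variable names are just illustrative, not from the problem):

```python
# Quick sanity check of the division in scientific notation.
height_m = 9.6e6        # satellite height above Earth's surface, in meters
speed_m_per_s = 3e8     # speed of a radio signal, in meters per second

# time = distance / speed
travel_time_s = height_m / speed_m_per_s

print(travel_time_s)           # 0.032
print(f"{travel_time_s:.1e}")  # 3.2e-02, i.e. 3.2 x 10^-2 seconds
```

The printed exponent confirms it is the -2 choice, not the positive 2 one.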

10^6/10^8 = 10^-2
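For the second question above, with the satellite orbiting at a height of 3.6 x 10^9 meters, the same formula applies:

Time = (3.6 x 10^9 meters) / (3 x 10^8 meters/second) = 1.2 x 10^1 seconds = 12 seconds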