Radio signals travel at a rate of 3 * 10^8 meters per second. How many seconds would it take a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 * 10^7 meters?

A.) 8.3 seconds**
B.) 1.2 * 10^-1 seconds
C.) 1.08 * 10^16 seconds

(3.6 * 10^7) / (3 * 10^8) = 3.6/30 = ?

To calculate the time it takes for a radio signal to travel from a satellite to the surface of the Earth, we can use the formula:

Time = Distance / Speed

The distance is given as the height of the satellite, which is 3.6 * 10^7 meters. The speed of radio signals is given as 3 * 10^8 meters per second.

Plugging the values into the formula:

Time = (3.6 * 10^7) / (3 * 10^8)

Simplifying the expression, we can separate the decimal factors from the powers of ten:

Time = (3.6 / 3) * (10^7 / 10^8)

Time = 1.2 * 10^(7-8)

Time = 1.2 * 10^(-1)

Since 10^(-1) = 0.1, the time taken by the radio signal to travel from the satellite to the surface of the Earth is:

Time = 0.12 seconds

Therefore, the correct answer is B.) 1.2 * 10^-1 seconds.
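
As a quick sanity check, here is a short Python sketch (not part of the original problem; the variable names are just illustrative) that carries out the same division:

# time = distance / speed
distance = 3.6e7   # height of the satellite in meters
speed = 3.0e8      # speed of a radio signal in meters per second
time = distance / speed
print(time)        # prints 0.12, i.e. 1.2 * 10^-1 seconds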

To calculate the time it takes for a radio signal to travel from a satellite to the surface of the Earth, we can use the formula:

Time = Distance / Speed

In this case, the distance is the height of the satellite, which is 3.6 * 10^7 meters, and the speed is the rate at which radio signals travel, which is 3 * 10^8 meters per second.

Substituting these values into the formula:

Time = (3.6 * 10^7 meters) / (3 * 10^8 meters per second)

Now, let's simplify the expression:

Time = (3.6 / 3) * (10^7 / 10^8) seconds

The powers of ten simplify to 10^(7-8) = 10^(-1), and 3.6 / 3 = 1.2:

Time = 1.2 * (10^(-1)) seconds

Therefore, the time it takes for a radio signal to travel from the satellite to the surface of the Earth is 1.2 * 10^-1 seconds, which is option B.
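
For completeness, a minimal Python sketch (again, only an illustration, assuming plain floating-point arithmetic) that prints the result in scientific notation so it can be matched directly against the answer choices:

distance_m = 3.6e7             # height of the satellite in meters
speed_m_per_s = 3.0e8          # speed of a radio signal in meters per second
travel_time = distance_m / speed_m_per_s
print(f"{travel_time:.1e} seconds")   # prints "1.2e-01 seconds", i.e. 1.2 * 10^-1 seconds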