A satellite orbits at an average altitude of h = 423 km. How long does it take to complete one full orbit? Use REarth = 6370 km and mEarth = 5.98 x 10^24 kg.

force of gravity = centripetal force

GMm/(R+h)^2 = m*v^2/(R+h)
but v = 2*pi*(R+h)/T
so solve for T.

What is GMm? Is it gravity (9.81) times the mass of the Earth?

To calculate the time it takes for a satellite to complete one full orbit around the Earth, we can use Kepler's third law, which states that the square of the orbital period (T) is directly proportional to the cube of the semi-major axis (a) of the satellite's orbit.

First, let's find the semi-major axis of the satellite's orbit. For a circular orbit the semi-major axis is simply the orbital radius, i.e. the sum of the Earth's radius (REarth) and the satellite's altitude (h = 423 km).

a = REarth + h

Substituting the values, we get:

a = 6370 km + 423 km
a = 6793 km

Now, we can find the orbital period (T) using Kepler's third law. The equation can be written as:

T^2 = (4 * pi^2 * a^3) / (G * M)

where T is the orbital period, pi is approximately 3.14, G is the universal gravitational constant (approximately 6.67 x 10^-11 N m^2/kg^2), and M is the mass of the Earth (mEarth). To answer your question: G is not the surface gravity of 9.81 m/s^2. The 9.81 m/s^2 is itself derived from G (g = G * mEarth / REarth^2), so GM is the gravitational constant multiplied by the mass of the Earth, and the satellite's mass m cancels out of the force balance.

Rearranging the equation to solve for T, we get:

T = sqrt((4 * pi^2 * a^3) / (G * M))
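
This is the same formula your force-balance approach gives: substitute v = 2*pi*(R+h)/T into GMm/(R+h)^2 = m*v^2/(R+h), cancel the satellite mass m, and solve for T. As a quick check of that algebra, here is a minimal symbolic sketch in Python, assuming the sympy library is available (the symbol names mirror your post):

    import sympy as sp

    # Symbols from the post: G, M = Earth's mass, m = satellite mass,
    # R = Earth's radius, h = orbital altitude, T = orbital period.
    G, M, m, R, h, T = sp.symbols('G M m R h T', positive=True)

    v = 2 * sp.pi * (R + h) / T                     # orbital speed of a circular orbit
    force_balance = sp.Eq(G * M * m / (R + h)**2, m * v**2 / (R + h))

    # One positive root, equivalent to 2*pi*sqrt((R + h)**3 / (G*M))
    print(sp.solve(force_balance, T))

Note that the satellite mass m drops out, which is why it never appears in Kepler's third law.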

Converting the semi-major axis to meters (a = 6793 km = 6.793 x 10^6 m) and substituting the values into the equation, we find:

T = sqrt((4 * 3.14^2 * (6.793 x 10^6 m)^3) / (6.67 x 10^-11 N m^2/kg^2 * 5.98 x 10^24 kg))

T = sqrt(3.10 x 10^7 s^2)

T = 5.57 x 10^3 s

Therefore, it takes approximately 5.57 x 10^3 seconds, or about 93 minutes, for the satellite to complete one full orbit around the Earth at an average altitude of 423 km.
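
To double-check the arithmetic, here is a minimal numerical sketch in plain Python (standard library only, SI units throughout):

    import math

    G = 6.67e-11        # gravitational constant, N m^2 / kg^2
    M = 5.98e24         # mass of the Earth, kg
    R_earth = 6370e3    # Earth's radius, m
    h = 423e3           # satellite altitude, m

    a = R_earth + h                               # orbital radius = 6.793 x 10^6 m
    T = 2 * math.pi * math.sqrt(a**3 / (G * M))   # orbital period, s
    print(f"T = {T:.0f} s = {T/60:.1f} min")      # T = 5570 s = 92.8 min

This agrees with the result above: roughly 5.57 x 10^3 s, i.e. about an hour and a half, which is a typical period for a satellite in low Earth orbit.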