If a satellite completes an orbit 820 miles above the earth in 11 hours at a velocity of 22,000 mph, how long would it take a satellite to complete an orbit if it is at 1400 miles above the earth at a velocity of 36,000 mph? (Use 36,000 miles as the radius of the earth.)

The answer should be 7.54 hours. What is the formula I need to get this answer?

The formula that will clear up your problem is

Vc = sqrt(µ/r)
where Vc = the velocity required to keep a body in a circular orbit, in feet/sec, r = the orbital radius in feet, and µ = the Earth's gravitational constant, 1.407974x10^16 ft^3/sec^2.
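
If you want to check the numbers yourself, here is a minimal Python sketch of that formula (same constant as above; the names are just for illustration):

    import math

    MU = 1.407974e16    # Earth's gravitational constant, ft^3/sec^2

    def circular_velocity_fps(r_ft):
        # Vc = sqrt(mu / r): velocity for a circular orbit of radius r_ft (feet), in ft/sec
        return math.sqrt(MU / r_ft)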

The numbers you offer are inconsistent and unphysical. Also, the radius of the earth is 3963 miles, or 20,924,640 feet, not 36,000 miles.

The real circular velocity for an orbit of 820 miles altitude is
Vc = sqrt(1.407974x10^16/((3963+820)(5280)))
Vc = 23,612 fps or 16,099 mph.
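
A quick numerical check of that velocity in Python (a sketch using the constants quoted above):

    import math

    r_ft = (3963 + 820) * 5280.0               # orbital radius in feet
    vc_fps = math.sqrt(1.407974e16 / r_ft)     # Vc = sqrt(mu / r)
    vc_mph = vc_fps * 3600.0 / 5280.0          # ft/sec -> mph
    print(round(vc_fps), round(vc_mph))        # ~23612 fps, ~16099 mph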

The orbital period is therefore
Tc = 2(3.14)(3963+820)(5280)/23,612 sec
Tc = 112 min.
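
A quick check of that circumference-over-velocity calculation (Python sketch):

    import math

    r_ft = (3963 + 820) * 5280.0
    vc_fps = math.sqrt(1.407974e16 / r_ft)
    period_s = 2.0 * math.pi * r_ft / vc_fps           # circumference / circular velocity
    print(round(period_s), round(period_s / 60.0))     # ~6720 sec, ~112 min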

The orbital period may also be derived from
Tc = 2(Pi)sqrt(r^3/µ)
Tc = 6.28sqrt(25,254,240^3/1.407974x10^16)
Tc = 6720 sec. or 1.866 hr or 112 min.
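
And the same result from the closed form Tc = 2(Pi)sqrt(r^3/µ), as a sketch:

    import math

    mu = 1.407974e16                   # ft^3/sec^2
    r = (3963 + 820) * 5280.0          # 25,254,240 ft
    period_s = 2.0 * math.pi * math.sqrt(r**3 / mu)
    print(round(period_s), round(period_s / 60.0))     # ~6720 sec, ~112 min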

For the 1400 mile altitude,
Vc = 15,203 mph and Tc = 132.98 min.
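
The 1400-mile case, worked the same way (sketch):

    import math

    mu = 1.407974e16
    r_ft = (3963 + 1400) * 5280.0
    vc_fps = math.sqrt(mu / r_ft)
    vc_mph = vc_fps * 3600.0 / 5280.0                    # circular velocity in mph
    period_min = 2.0 * math.pi * r_ft / vc_fps / 60.0    # period in minutes
    print(vc_mph, period_min)                            # ~15,203 mph, ~133 min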

Tc(1400)/Tc(820) = r(1400)^(3/2)/r(820)^(3/2)
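
That ratio gives the 1400-mile period directly from the 820-mile one, since circular periods scale as r^(3/2); a one-line check:

    t_820_min = 112.0
    ratio = ((3963 + 1400) / (3963 + 820)) ** 1.5   # r(1400)^(3/2) / r(820)^(3/2)
    print(round(t_820_min * ratio, 1))              # ~133 min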


To find the time it takes a satellite to complete an orbit at a different altitude, we can use the relationship between a satellite's orbital period and the size of its orbit.

The orbital period of a satellite can be calculated using Kepler's third law of planetary motion, which states that the square of the orbital period is proportional to the cube of the semi-major axis. The semi-major axis is the average distance of the satellite from the center of the Earth.

First, let's use the initial scenario, where the satellite is 820 miles above the Earth's surface and completes an orbit in 11 hours, to find the constant of proportionality.

1. Add the altitude to the radius of the Earth to get the semi-major axis:
Altitude = 820 miles
Radius of the Earth = 36,000 miles
Semi-major axis = Radius of the Earth + Altitude

Semi-major axis = 36,000 + 820 = 36,820 miles
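
As a quick check in Python, keeping the problem's stated 36,000-mile Earth radius (not the real value):

    EARTH_RADIUS_MI = 36000           # radius given in the problem, not the real radius
    a1 = EARTH_RADIUS_MI + 820        # 36,820 miles
    a2 = EARTH_RADIUS_MI + 1400       # 37,400 miles
    print(a1, a2)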

2. Use the formula for the orbital period:
T₁² ∝ a₁³

T₁² = k · a₁³

Where T₁ is the orbital period, a₁ is the semi-major axis, and k is the constant of proportionality (the same for every orbit around the Earth), so T₁²/a₁³ = k.

3. Calculate the value of k:
T₁² = k · a₁³
11² = k · (36,820)³

k = (11²) / (36,820)³
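
Numerically (a sketch; k here carries units of hours² per mile³):

    a1 = 36000 + 820         # semi-major axis of the first orbit, miles
    T1 = 11.0                # given orbital period, hours
    k = T1**2 / a1**3        # Kepler's third law constant, hr^2 / mi^3
    print(k)                 # ~2.42e-12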

4. Use the value of k to find the orbital period at the new altitude.
T₂² = k · a₂³

Where T₂ is the orbital period at the new altitude and a₂ is the corresponding semi-major axis.
We know that a₂ = Radius of the Earth + Altitude₂

Altitude₂ = 1400 miles
Semi-major axis₂ = Radius of the Earth + Altitude₂

Semi-major axis₂ = 36,000 + 1400 = 37,400 miles

Plug in the values:
T₂² = k · (37,400)³

Solve for T₂ by taking the square root of both sides:
T₂ = √(k · (37,400)³)

5. Calculate T₂ to find the time it takes for the satellite to complete an orbit at the new altitude.

Substitute the value of k and evaluate:
T₂ = √(k · (37,400)³)

T₂ ≈ √((11²) / (36,820)³ · (37,400)³)

T₂ ≈ √((11²) · (37,400)³ / (36,820)³)

T₂ ≈ √(121 · 5.231 × 10¹³ / 4.992 × 10¹³)

T₂ ≈ √126.8 ≈ 11.26 hours

Therefore, it would take approximately 11.26 hours for the satellite to complete an orbit at an altitude of 1400 miles. Note that this estimate uses only the two altitudes and the given 11-hour period; the quoted velocities never enter Kepler's third law, and, as the first answer points out, the problem's numbers (including the expected 7.54-hour answer) are not mutually consistent.
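
For completeness, the whole Kepler's-third-law estimate in one short Python sketch (same numbers as above; the variable names are just illustrative):

    a1 = 36000 + 820             # semi-major axis of the 820-mile orbit, miles
    a2 = 36000 + 1400            # semi-major axis of the 1400-mile orbit, miles
    T1 = 11.0                    # given period of the first orbit, hours

    k = T1**2 / a1**3            # constant of proportionality
    T2 = (k * a2**3) ** 0.5      # T2 = sqrt(k * a2^3)
    print(round(T2, 2))          # ~11.26 hours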