posted by Josh
An earth satellite moves in a circular orbit with an orbital speed of 5800 m/s.
Find the time (expressed in seconds) of one revolution of the satellite.
Find the radial acceleration of the satellite in its orbit.
I think we need to know the radius of the earth and the satellite's average distance from the earth in order to answer the question. All we have right now is the orbital speed. If you have that information, then use
d = v*t, where d is the circumference of the orbit = 2*pi*r, and
a = v^2/r, which is the radial (centripetal) acceleration.
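To see how those two formulas would be used once the orbit radius is known, here is a minimal sketch. The radius value below is an assumption chosen only for illustration; the given data is just the speed.

```python
import math

v = 5800.0   # orbital speed, m/s (given in the question)
r = 1.19e7   # orbit radius from Earth's center, m -- assumed for illustration only

T = 2 * math.pi * r / v   # one revolution covers the circumference 2*pi*r
a = v**2 / r              # radial (centripetal) acceleration

print(f"T = {T:.0f} s, a = {a:.3f} m/s^2")
```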
The length of time it takes for a satellite to orbit the earth, its orbital period, varies with the altitude of the satellite above the earth's surface. The lower the altitude, the shorter the period. The higher the altitude, the longer the period. For example, the orbital period for a 100 mile high satellite is ~88 minutes; 500 miles ~101 minutes; 1000 miles ~118 minutes; 10,000 miles 9hr-18min; 22,238 miles 23hr-56min-4.09sec. A satellite in an equatorial orbit of 22,238 miles altitude remains stationary over a point on the Earth's equator and the orbit is called a geostationary orbit. A satellite at the same 22,238 miles altitude, but with its orbit inclined to the equator, has the same orbital period and is referred to as a geosynchronous orbit as it is in sync with the earth's rotation.
Not surprisingly, the velocity of a satellite decreases as the altitude increases. The velocities at the same altitudes described above are 25,616 fps (17,465 mph) for 100 miles, 24,441 fps (16,660 mph) for 500 miles, 23,177 fps (15,800 mph) for 1000 miles, 13,818 fps (9419 mph) for 10,000 miles, and 10,088 fps (6877 mph) for 22,238 miles.
Depending on your math knowledge, you can calculate the orbital velocity and orbital period from two simple expressions. You might like to try them out if you have a calculator.
The time it takes a satellite to orbit the earth, its orbital period, can be calculated from
T = 2(Pi)sqrt[a^3/µ]
where T is the orbital period in seconds, Pi = 3.1416, a = the semi-major axis of an elliptical orbit = (rp+ra)/2 where rp = the perigee (closest) radius and ra = the apogee (farthest) radius from the center of the earth, and µ = the earth's gravitational constant = ~1.40766x10^16 ft.^3/sec.^2. In the case of a circular orbit, a = r, the radius of the orbit. Thus, for a 250 miles high circular orbit, a = r = (3963 + 250)5280 ft. and T = 2(3.1416)sqrt[[[(3963+250)5280]^3]/1.40766x10^16] = ~5555 seconds = ~92.6 minutes.
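That worked example is easy to check by machine. The sketch below just evaluates T = 2*pi*sqrt(r^3/µ) for the same 250-mile circular orbit, using the µ value and 3963-mile Earth radius from the post.

```python
import math

MU_FT = 1.40766e16   # Earth's gravitational parameter, ft^3/s^2 (value from the post)
R_EARTH_MI = 3963    # Earth's radius in miles (value from the post)

def orbital_period(radius_ft):
    """Period of a circular orbit: T = 2*pi*sqrt(r^3 / mu)."""
    return 2 * math.pi * math.sqrt(radius_ft**3 / MU_FT)

r = (R_EARTH_MI + 250) * 5280   # 250-mile-high circular orbit, radius in feet
T = orbital_period(r)
print(round(T), "s =", round(T / 60, 1), "min")   # ~5555 s, ~92.6 min
```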
The velocity required to maintain a circular orbit around the Earth may be computed from the following:
Vc = sqrt(µ/r)
where Vc is the circular orbital velocity in feet per second, µ (pronounced "mew" as opposed to "meow") is the gravitational constant of the earth, ~1.40766x10^16 ft.^3/sec.^2, and r is the distance from the center of the earth to the altitude in question in feet. Using 3963 miles for the radius of the earth, the orbital velocity required for a 250 miles high circular orbit would be Vc = sqrt(1.40766x10^16/[(3963+250)x5280]) = sqrt(1.40766x10^16/22,244,640) = 25,155 fps (17,152 mph). Since velocity is inversely proportional to the square root of r, the higher you go, the smaller the required orbital velocity.
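The same expression reproduces the whole velocity table given earlier. A quick sketch, using the post's µ and Earth-radius figures:

```python
import math

MU_FT = 1.40766e16   # Earth's gravitational parameter, ft^3/s^2 (value from the post)
R_EARTH_MI = 3963    # Earth's radius in miles (value from the post)

def circular_velocity_fps(alt_miles):
    """Circular-orbit speed Vc = sqrt(mu / r), r measured from Earth's center in feet."""
    r_ft = (R_EARTH_MI + alt_miles) * 5280
    return math.sqrt(MU_FT / r_ft)

for alt in (100, 500, 1000, 10000, 22238):
    v = circular_velocity_fps(alt)
    print(f"{alt:>6} mi: {v:,.0f} fps ({v * 3600 / 5280:,.0f} mph)")
```

The printed values match the 25,616 / 24,441 / 23,177 / 13,818 / 10,088 fps figures quoted above.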
The question here deserves a careful look: as TchrWill points out, orbital velocity depends on radius, so the given speed actually fixes the altitude. The slower it goes, the higher it is.
Centripetal acceleration= V^2/r
acceleration due to gravity = g*(radius of earth/radius of satellite orbit)^2
setting these equal...
v^2/r = g*re^2/r^2
or r = g*re^2/v^2
So period = 2*Pi*r/v = 2*Pi*g*re^2/v^3
T = 2*3.1416*9.8*(6.38E6)^2/(5800)^3
You do it. I get over three hours, at a very high altitude.
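Carrying that derivation through in SI units answers both parts of the original question. A minimal sketch, using g = 9.8 m/s^2 and the 6.38E6 m Earth radius from the working above:

```python
import math

g = 9.8      # surface gravity, m/s^2
Re = 6.38e6  # Earth's radius, m
v = 5800.0   # given orbital speed, m/s

r = g * Re**2 / v**2      # from v^2/r = g*(Re/r)^2
T = 2 * math.pi * r / v   # period of one revolution
a = v**2 / r              # radial (centripetal) acceleration

print(f"r = {r:.3e} m (altitude ~{(r - Re) / 1000:.0f} km)")
print(f"T = {T:.0f} s (~{T / 3600:.1f} h)")
print(f"a = {a:.2f} m/s^2")
```

This gives a period of roughly 12,800 s, a bit over 3.5 hours, consistent with the "over three hours" estimate, at an altitude of several thousand kilometers.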