trig
posted by John .
In a computer simulation, a satellite orbits around Earth at a distance from the Earth's surface of 2.1 x 10^4 miles. The orbit is circular, and one revolution around Earth takes 10.5 days. Assuming the radius of the Earth is 3960 miles, find the linear speed (velocity) of the satellite.

speed is distance/time
for a circle, C = 2*pi*r
r = 3960 + 21,000 = 24,960 miles
C = 2*pi(24,960) = 156,828 miles
speed = 156,828 mi / 10.5 days = 14,936 mi/day (about 622 mi/h)
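The arithmetic above is easy to check with a short script. This is just a sketch of the distance/time calculation, reading the problem's altitude as 2.1 x 10^4 = 21,000 miles:

```python
import math

# Given values from the problem statement
altitude_mi = 2.1e4       # distance above Earth's surface, miles
earth_radius_mi = 3960.0  # assumed Earth radius, miles
period_days = 10.5        # one revolution, days

r = earth_radius_mi + altitude_mi    # orbital radius: 24,960 mi
circumference = 2 * math.pi * r      # distance covered per revolution
speed = circumference / period_days  # linear speed, mi/day

print(round(r))              # 24960
print(round(circumference))  # 156828
print(round(speed))          # 14936
```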
Part of my previous reply was lost in the posting process.
The orbital radius is 3960 + 2.1(10^4) = 24,960 miles or 131,788,800 feet.
The alleged time to complete one orbit is 10.5(24)(3600) = 907,200 seconds, making the derived orbital velocity
V = 2(Pi)r/T = 912.8 fps.
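That derived speed can be checked in feet per second; this sketch again assumes the 21,000-mile reading of the altitude:

```python
import math

r_ft = (3960 + 2.1e4) * 5280  # orbital radius in feet: 131,788,800
period_s = 10.5 * 24 * 3600   # 10.5 days in seconds: 907,200

# Speed implied by the simulation: circumference divided by period
v = 2 * math.pi * r_ft / period_s
print(round(v, 1))  # ~912.8 ft/s
```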
Unfortunately, the real orbital velocity required to remain in a circular orbit derives from Vc = sqrt(µ/r),
where µ = the earth's gravitational constant and r = the orbital radius in feet:
Vc = sqrt[(1.407974x10^16)/((3960 + 21,000)5280)]
= 10,336 feet per second.
The orbital period is
T = 2(Pi)sqrt[r^3/µ] =
2(3.14159)sqrt[131,788,800^3/1.407974x10^16] = 80,112 seconds or about 22.25 hours.
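Both formulas can be verified numerically. This sketch uses the value of µ quoted above (1.407974x10^16 ft^3/s^2) and assumes the 21,000-mile altitude:

```python
import math

MU = 1.407974e16              # Earth's gravitational constant, ft^3/s^2 (quoted above)
r_ft = (3960 + 2.1e4) * 5280  # orbital radius in feet

v_circ = math.sqrt(MU / r_ft)              # required circular-orbit speed, ft/s
T = 2 * math.pi * math.sqrt(r_ft**3 / MU)  # orbital period, seconds

print(round(v_circ))       # ~10,336 ft/s
print(round(T / 3600, 2))  # ~22.25 hours
```

The required speed (10,336 fps) is more than ten times the 912.8 fps implied by the simulation's 10.5-day period, which is the reply's point: the simulated orbit is not physically consistent.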