A satellite, B, is 17,000 miles from the horizon of Earth. Earth's radius is about 4,000 miles. Find the approximate distance the satellite is from the Earth's surface.

Possible answers:
13,464
17,464
20,957
21,000
I do not understand how to do these problems.

I did not understand you at first, but after reviewing the scenario, the answer is

d = sqrt(17,000^2 + 4,000^2) - 4,000 = sqrt(305,000,000) - 4,000 ≈ 17,464 - 4,000 ≈ 13,464 miles
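As a quick sanity check, the arithmetic can be verified in a few lines of Python (a hypothetical check, not part of the original problem):

```python
import math

# The line of sight to the horizon is tangent to the Earth, so it meets
# the radius at a right angle; the satellite-to-center distance is the
# hypotenuse, and the radius and horizon distance are the legs.
horizon_distance = 17_000  # miles, satellite to horizon
earth_radius = 4_000       # miles, approximate

# Height above the surface = hypotenuse - radius
height = math.hypot(horizon_distance, earth_radius) - earth_radius
print(round(height))  # -> 13464
```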

A satellite is 6,800 miles from the horizon of Earth. Earth's radius is about 4,000 miles. Find the approximate distance the satellite is from the point directly below it on Earth's surface.

To solve this problem, we can use the Pythagorean theorem, which states that in a right triangle, the square of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

The key is identifying the right triangle correctly. The line of sight from the satellite to the horizon is tangent to the Earth, so it meets the Earth's radius at the horizon point at a right angle. The two legs are therefore the Earth's radius (4,000 miles) and the distance from the satellite to the horizon (17,000 miles in the original problem). The hypotenuse is the distance from the satellite to the Earth's center.

Let's denote the distance from the satellite to the Earth's surface as "x." The distance from the satellite to the Earth's center is then x + 4,000.

According to the Pythagorean theorem, we can write the following equation:

(x + 4,000)^2 = 17,000^2 + 4,000^2

(x + 4,000)^2 = 289,000,000 + 16,000,000

(x + 4,000)^2 = 305,000,000

x + 4,000 ≈ √305,000,000 ≈ 17,464

x ≈ 17,464 - 4,000 = 13,464

Therefore, the approximate distance from the satellite to the Earth's surface is 13,464 miles, which matches the first of the possible answers given.

The 6,800-mile version works the same way: x = √(6,800^2 + 4,000^2) - 4,000 ≈ 7,889 - 4,000 ≈ 3,889 miles.

If I understand you correctly, the distance you seek is

d = sqrt(17,000^2 + 4,000^2) - 4,000 ≈ 13,464
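The same formula handles both versions of the problem. Here is a small Python sketch (assuming the tangent-line geometry above; the function name is mine, not from the thread):

```python
import math

def satellite_height(horizon_distance, earth_radius=4_000):
    """Height of a satellite above the surface, in miles, given its
    straight-line distance to the horizon (a tangent to the Earth).

    The tangent meets the radius at a right angle, so the
    satellite-to-center distance is the hypotenuse."""
    return math.hypot(horizon_distance, earth_radius) - earth_radius

print(round(satellite_height(17_000)))  # original problem -> 13464
print(round(satellite_height(6_800)))   # second problem   -> 3889
```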