A 100-foot-tall antenna sits partway up a hill. The hill makes an angle of 12 degrees with the horizontal; in other words, if you were to walk up the hill, you would walk at an angle of 12 degrees. To keep the antenna stable, it must be anchored by 2 cables. The distance from the base of the antenna to the tie-down point DOWNHILL is 95 feet. Ignore the amount of cable needed to fasten the cable to the antenna or to the tie-downs. How much cable is needed?

What about the uphill anchor point? No distance is given for it.

I assume the antenna is vertical.

Anyway, since the hill rises at 12°, a vertical antenna makes an angle of 90° - 12° = 78° with the uphill slope and 90° + 12° = 102° with the downhill slope. If the uphill anchor point is x feet from the base, then by the law of cosines

the uphill cable length is u, where

u^2 = x^2 + 100^2 - 2*100*x*cos 78°

and the downhill cable length is d, where

d^2 = 95^2 + 100^2 - 2*95*100*cos 102°

which works out to d ≈ 151.6 feet.
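Here is a minimal Python sketch of those two formulas. The uphill distance x is not given in the problem, so u is left as a function of x; the x = 95 passed in at the end is only an illustrative guess, not part of the problem.

```python
import math

H = 100.0       # antenna height in feet
D_DOWN = 95.0   # slope distance to the downhill anchor in feet

def uphill_cable(x):
    """Cable length for an uphill anchor x feet from the base (law of cosines, 78 deg)."""
    return math.sqrt(x**2 + H**2 - 2 * H * x * math.cos(math.radians(78)))

# Downhill cable via the law of cosines with the 102 deg angle
d = math.sqrt(D_DOWN**2 + H**2 - 2 * D_DOWN * H * math.cos(math.radians(102)))
print(f"downhill cable d = {d:.1f} ft")                    # about 151.6 ft
print(f"uphill cable u(95) = {uphill_cable(95):.1f} ft")   # about 122.8 ft, IF x were 95
```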

Another way to get the downhill cable length is to break the problem into two parts: the horizontal distance and the vertical distance from the top of the antenna to the anchor point.

First, let's calculate the horizontal distance from the base of the antenna to the downhill anchor point. The 95 feet is measured along the slope, so it is the hypotenuse of the right triangle formed by the hill and the horizontal ground, and the horizontal run is the adjacent side.

We know that the hill makes an angle of 12 degrees with the horizontal. Let's call the horizontal run "x". Using the cosine of the slope angle, we have:

cos(12 degrees) = x / 95 feet

To solve for x, we can rearrange the equation:

x = 95 feet * cos(12 degrees) ≈ 92.9 feet

Now, let's calculate the vertical distance. The top of the antenna is 100 feet above its base, and the downhill anchor sits below the base by the rise of that same right triangle, 95 feet * sin(12 degrees) ≈ 19.8 feet. So the total vertical drop from the top of the antenna to the anchor is:

vertical distance = 100 feet + 95 feet * sin(12 degrees) ≈ 119.8 feet

Finally, the horizontal and vertical distances are the two legs of a right triangle whose hypotenuse is the cable, so by the Pythagorean theorem:

cable length = square root of (horizontal distance^2 + vertical distance^2)

Plugging in the values above gives sqrt(92.9^2 + 119.8^2) ≈ 151.6 feet, matching the law-of-cosines answer. (Note that cos 102° = -sin 12°, so the two formulas are algebraically identical.)
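For completeness, a short Python sketch of this decomposition (variable names are my own), which should reproduce the law-of-cosines result:

```python
import math

H, D_DOWN = 100.0, 95.0          # antenna height, downhill slope distance (feet)
SLOPE = math.radians(12)         # hill angle

horizontal = D_DOWN * math.cos(SLOPE)       # ~92.9 ft: run along the ground
vertical = H + D_DOWN * math.sin(SLOPE)     # ~119.8 ft: drop from antenna top to anchor
d = math.hypot(horizontal, vertical)        # Pythagorean theorem
print(f"downhill cable d = {d:.1f} ft")     # about 151.6 ft, same as before
```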