An attack submarine is at a depth of 560 feet below the surface and its sonar detects a ship on the surface at a 1.5 degree angle of elevation. What is the actual distance from the submarine to the ship, in yards? Additionally, if the sonar signal travels through sea water at an average velocity of 1500 meters/second, how long will it take, in seconds, for the sonar signal to get back to the sub after it is released?

To calculate the actual distance from the submarine to the ship, we can use trigonometry. The submarine's depth of 560 feet is the side opposite the 1.5 degree angle of elevation, and the actual (straight-line) distance d from the submarine to the ship is the hypotenuse of the right triangle.

Using the sine ratio:
sin(1.5 degrees) = 560 / d

Rearranging the equation to solve for d:
d = 560 / sin(1.5 degrees) ≈ 21,392.9 feet

To convert the result from feet to yards, we divide by 3, since there are 3 feet in a yard:
d ≈ 21,392.9 / 3 ≈ 7,131.0 yards

Therefore, the actual distance from the submarine to the ship is approximately 7,131 yards.
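As a quick numerical check, the distance calculation can be reproduced with Python's math module (a minimal sketch; the 3 ft/yd conversion factor is exact):

```python
import math

depth_ft = 560.0   # submarine depth in feet
angle_deg = 1.5    # angle of elevation to the ship

# The depth is the side opposite the angle of elevation; the actual
# (slant) distance to the ship is the hypotenuse, so use the sine ratio.
slant_ft = depth_ft / math.sin(math.radians(angle_deg))
slant_yd = slant_ft / 3.0  # 3 feet per yard

print(round(slant_yd, 1))  # ≈ 7131.0 yards
```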

To calculate the time it takes for the sonar signal to return to the submarine, we need the round-trip distance. Since the signal travels to the ship and back, the round-trip distance is twice the one-way distance between the submarine and the ship: approximately 14,262 yards (2 × 7,131).

To calculate the time, we divide the round-trip distance by the velocity of sound in sea water.
time = round-trip distance / velocity

Converting the velocity to yards/second (since the distance is in yards), using 1 meter ≈ 1.0936 yards:
velocity = 1500 meters/second × 1.0936 yards/meter ≈ 1,640.4 yards/second

Now let's calculate the time it takes for the sonar signal to get back to the submarine:
time = 14,262 yards / 1,640.4 yards/second ≈ 8.69 seconds

Therefore, it will take approximately 8.69 seconds for the sonar signal to get back to the submarine after it is released.
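The travel-time calculation can be sketched the same way, working entirely in meters to avoid the yard conversion (the 0.3048 m/ft factor is exact):

```python
import math

depth_ft = 560.0
angle_deg = 1.5
speed_m_s = 1500.0  # average speed of sound in sea water

# One-way slant distance in feet, then converted to meters.
slant_ft = depth_ft / math.sin(math.radians(angle_deg))
slant_m = slant_ft * 0.3048  # exact feet-to-meters factor

# The signal travels out and back, so the round trip is twice that.
round_trip_m = 2.0 * slant_m
time_s = round_trip_m / speed_m_s

print(round(time_s, 2))  # ≈ 8.69 seconds
```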