Two jets left an airport at the same time. One traveled east at 300 mph, the other south at 400 mph. How far apart were the jets at the end of an hour?

To find how far apart the jets were at the end of an hour, we can use the Pythagorean theorem, because the jets are traveling at right angles to each other.

One jet traveled east at 300 mph for an hour, so it covered a distance of 300 miles (distance = speed × time).
The other jet traveled south at 400 mph for an hour, so it covered a distance of 400 miles.

Now, we have a right-angled triangle, with the distances covered by the jets as the two legs. Let's call the distance between the jets "d".

According to the Pythagorean theorem, the square of the hypotenuse (d) is equal to the sum of the squares of the other two sides.

So, d² = 300² + 400²
d² = 90,000 + 160,000
d² = 250,000

Taking the square root of both sides, we find:
d = √250,000
d = 500

Therefore, the jets were exactly 500 miles apart at the end of an hour.
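If you want to check the arithmetic programmatically, here is a minimal Python sketch of the same calculation (the variable names are illustrative, not part of the original problem):

```python
import math

# Distances covered in one hour (distance = speed * time).
east_distance = 300 * 1   # miles, jet heading east
south_distance = 400 * 1  # miles, jet heading south

# The paths are perpendicular, so the separation is the hypotenuse
# of a right triangle: d = sqrt(east^2 + south^2).
separation = math.hypot(east_distance, south_distance)

print(separation)  # 500.0
```

As a sanity check, note that 300-400-500 is the classic 3-4-5 right triangle scaled by a factor of 100, so the answer of 500 miles is exact.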
