Two jets left an airport at the same time. One traveled east at 300 miles per hour; the other traveled south at 400 miles per hour. How far apart were the jets at the end of the hour?

To solve this problem, we can use the Pythagorean theorem.

In one hour, the eastbound jet travels 300 miles and the southbound jet travels 400 miles. Because east and south are perpendicular directions, the two flight paths form the legs of a right triangle, and the distance between the jets, call it d, is the hypotenuse.

According to the Pythagorean theorem, the square of the hypotenuse (distance between the two jets) is equal to the sum of the squares of the other two sides.

So, the equation becomes:

d^2 = 300^2 + 400^2

Calculating the squares:

d^2 = 90000 + 160000

d^2 = 250000

Taking the square root of both sides:

d = √250000

d = 500

Thus, the distance between the two jets at the end of the hour is 500 miles.
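As a quick sanity check, here is a minimal Python sketch of the same arithmetic (the variable names are illustrative, not from the problem):

```python
import math

# Distance each jet covers in one hour: speed (mph) × 1 hour
eastbound_miles = 300 * 1
southbound_miles = 400 * 1

# Pythagorean theorem: the separation is the hypotenuse
separation = math.sqrt(eastbound_miles**2 + southbound_miles**2)
print(separation)  # 500.0
```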

To find out how far apart the jets were at the end of the hour, we can use the Pythagorean theorem.

Let's assume that the starting point of both jets is the origin (0,0) on a coordinate plane.

The first jet traveled east at 300 miles per hour, so its position after 1 hour would be (300, 0) on the x-axis.

The second jet traveled south at 400 miles per hour, so its position after 1 hour would be (0, -400) on the y-axis.

To find the distance between the two points, we can calculate the hypotenuse of the right-angled triangle formed by the two positions.

Using the Pythagorean theorem, the distance between the two points is:
√((300)^2 + (-400)^2)

Simplifying, we have:
√(90000 + 160000)
√(250000)
500

Therefore, the two jets were 500 miles apart at the end of the hour.
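The coordinate version translates directly into Python. A minimal sketch using math.dist, which returns the Euclidean distance between two points (the position tuples below just mirror the coordinates chosen above):

```python
import math

eastbound_position = (300, 0)    # after 1 hour heading east at 300 mph
southbound_position = (0, -400)  # after 1 hour heading south at 400 mph

# Euclidean distance between the two jets' positions
print(math.dist(eastbound_position, southbound_position))  # 500.0
```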

To find the distance between the two jets at the end of the hour, we can use the Pythagorean theorem, which relates the sides of a right triangle.

The eastbound jet traveled at 300 miles per hour for 1 hour, so it covered 300 miles.

The southbound jet traveled at 400 miles per hour for 1 hour, so it covered 400 miles.

Now, we can create a right triangle with the distances traveled by the two jets as the legs of the triangle. The distance between the two jets is the hypotenuse.

Using the Pythagorean theorem: c^2 = a^2 + b^2

Where:
c is the hypotenuse (distance between the jets)
a is the distance traveled by the eastbound jet (300 miles)
b is the distance traveled by the southbound jet (400 miles)

Plugging in the values, we have:
c^2 = 300^2 + 400^2

Calculating the squares:
c^2 = 90000 + 160000
c^2 = 250000

Taking the square root of both sides:
c = √250000
c = 500

Therefore, the distance between the jets at the end of the hour is 500 miles.
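For completeness, the hypotenuse can also be computed in one step with math.hypot, which returns √(a² + b²) directly from the two legs:

```python
import math

# math.hypot(a, b) returns sqrt(a**2 + b**2)
c = math.hypot(300, 400)  # legs: 300 miles east, 400 miles south
print(c)  # 500.0
```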