A plane flies northwards for a distance of 430 km. It then flies eastwards for 380 km. How far is it from its starting point? (Ignore its height above the ground.) Use Pythagoras' rule.


To find the distance from the starting point, we can use the Pythagorean theorem, which states that in a right triangle, the square of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides.

In this case, the distance flown northwards and eastwards form the legs of the right triangle, and we need to find the hypotenuse, which represents the distance from the starting point.

Let's call the distance flown northwards side A (430 km), the distance flown eastwards side B (380 km), and the distance from the starting point the hypotenuse C (which we need to find).

The Pythagorean theorem can be written as follows:

C^2 = A^2 + B^2

Substituting the given values:

C^2 = (430 km)^2 + (380 km)^2

Now, we can calculate C by taking the square root of both sides of the equation:

C = sqrt((430 km)^2 + (380 km)^2)

Using a calculator, we can evaluate this expression to find the value of C.

C = sqrt(184,900 km^2 + 144,400 km^2)

C = sqrt(329,300 km^2)

C ≈ 573.85 km

Therefore, the plane is approximately 573.85 km from its starting point.
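If you want to check the arithmetic without a calculator, here is a quick sketch in Python (the variable names are just for illustration):

import math

north_km = 430.0  # distance flown northwards (one leg of the right triangle)
east_km = 380.0   # distance flown eastwards (the other leg)

# math.hypot computes sqrt(north_km**2 + east_km**2), i.e. the hypotenuse
distance_km = math.hypot(north_km, east_km)

print(f"Distance from starting point: {distance_km:.2f} km")  # prints about 573.85 km

math.hypot is just a convenience; squaring the two legs, adding, and taking math.sqrt gives the same answer.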

So, use Pythagoras' rule:

c^2 = a^2+b^2

You have a and b, so just plug and chug.