Plane A is 40 mi south and 100 mi east of Plane B. Plane A is flying 2 miles west for every mile it flies north, while Plane B is flying 3 mi east for every mile it flies south.

a. Where do their paths cross?

b. Which plane must fly farther?

c. What ratio of the speed of Plane B to the speed of Plane A would
produce a midair collision?

Your previous post has been answered:

http://www.jiskha.com/display.cgi?id=1254150016

To solve these problems, we need to set up a coordinate system, write the equation of each plane's flight path, and find where the two lines meet.

a. To find where their paths cross, place Plane B at the origin (0, 0), with east as the positive x-direction and north as the positive y-direction. (Plane B is not stationary; the origin is just a convenient reference point.)

Plane A is 40 miles south and 100 miles east of Plane B, so its starting position is (100, -40).

Next, consider the direction each plane is flying. Plane A moves 2 miles west for every mile it flies north, so its path has slope 1/(-2) = -1/2. Plane B moves 3 miles east for every mile it flies south, so its path has slope (-1)/3 = -1/3.

Writing the equation of each path:

Plane A passes through (100, -40) with slope -1/2:
y = -40 - (1/2)(x - 100)
y = 10 - x/2

Plane B passes through (0, 0) with slope -1/3:
y = -x/3

Setting the two expressions for y equal:

10 - x/2 = -x/3
10 = x/2 - x/3
10 = x/6
x = 60

Then y = -60/3 = -20.

Therefore, the paths cross at (60, -20), that is, 60 miles east and 20 miles south of Plane B's starting point (equivalently, 40 miles west and 20 miles north of Plane A's starting point).
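A quick way to double-check the crossing point is to solve the two path equations numerically. Here is a minimal Python sketch (the function names are just illustrative):

# Plane A's path: y = 10 - x/2; Plane B's path: y = -x/3
def path_a(x):
    return 10 - x / 2

def path_b(x):
    return -x / 3

# Setting 10 - x/2 = -x/3 gives (1/2 - 1/3)x = 10
x_cross = 10 / (1/2 - 1/3)
y_cross = path_b(x_cross)
print(x_cross, y_cross)   # 60.0 -20.0, matching the algebra above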

b. To determine which plane must fly farther, compare the distance each plane covers from its starting point to the crossing point (60, -20).

Plane A flies from (100, -40) to (60, -20). Using the Pythagorean theorem:

Distance A = sqrt((100 - 60)^2 + (-40 + 20)^2)
Distance A = sqrt(40^2 + 20^2)
Distance A = sqrt(2000)
Distance A = 20*sqrt(5) ≈ 44.72 miles

Plane B flies from (0, 0) to (60, -20):

Distance B = sqrt(60^2 + 20^2)
Distance B = sqrt(4000)
Distance B = 20*sqrt(10) ≈ 63.25 miles

So Plane B must fly farther.
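As a sanity check, the two flight distances can be computed directly from the coordinates found above (a small Python sketch using only the standard math module):

import math

cross = (60, -20)
start_a = (100, -40)
start_b = (0, 0)

dist_a = math.hypot(cross[0] - start_a[0], cross[1] - start_a[1])   # 20*sqrt(5)
dist_b = math.hypot(cross[0] - start_b[0], cross[1] - start_b[1])   # 20*sqrt(10)
print(round(dist_a, 2), round(dist_b, 2))   # 44.72 63.25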

c. For a midair collision, both planes must reach the crossing point (60, -20) at the same time. If t is that common flight time, then

speed of A = Distance A / t and speed of B = Distance B / t

so the required ratio is

speed B : speed A = Distance B : Distance A = 20*sqrt(10) : 20*sqrt(5) = sqrt(2) : 1 ≈ 1.414 : 1

So Plane B would need to fly approximately 1.414 times as fast as Plane A to produce a midair collision.
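To confirm the ratio, you can verify that speeds in the ratio sqrt(2) : 1 give equal flight times to the crossing point. The 300 mph below is an assumed value chosen only for the check; any speed for Plane A works:

import math

dist_a = 20 * math.sqrt(5)     # Plane A's distance to the crossing point
dist_b = 20 * math.sqrt(10)    # Plane B's distance to the crossing point

speed_a = 300                  # assumed speed for Plane A (mph), illustration only
speed_b = speed_a * math.sqrt(2)

print(dist_a / speed_a, dist_b / speed_b)   # equal times -> the planes collide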