Planes A and B are both flying eastward with constant velocity in the same horizontal plane, at angles of 30 degrees and 60 degrees with respect to the horizontal respectively. The speed of A is 100√3 miles per second. Initially, the pilot of plane A sees plane B directly ahead at the same altitude, at a horizontal distance of 500 meters. Assuming the two planes do not change direction or altitude, in how many seconds will plane A collide with plane B?

To determine when plane A will collide with plane B, we need to calculate the time it will take for plane A to cover the horizontal distance of 500 meters.

First, we need to find the horizontal speed of plane A. We are given that the speed of plane A is 100√3 miles per second. However, the given horizontal distance is in meters, so we need to convert the speed to meters per second.

1 mile = 1609.34 meters

Therefore, the speed of plane A in meters per second is:
100√3 miles/second × 1609.34 meters/mile = 160934√3 meters/second
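As a quick sanity check of this conversion, here is a minimal Python sketch that takes the stated mi/s unit at face value (a later comment suggests the speed is really in m/s):

```python
import math

# Convert 100*sqrt(3) mi/s to m/s, taking the stated unit at face value.
speed_mi_per_s = 100 * math.sqrt(3)       # given speed of plane A
speed_m_per_s = speed_mi_per_s * 1609.34  # 1 mile = 1609.34 m
print(f"{speed_m_per_s:,.0f} m/s")        # about 278,746 m/s
```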

Now, we can calculate the time it will take for plane A to travel 500 meters horizontally:
Time = Distance / Speed

Time = 500 meters / (160934√3 meters/second)

To evaluate the denominator numerically:
160934√3 meters/second ≈ 160934 × 1.732 meters/second ≈ 278,746 meters/second

Substituting this value into the time formula:
Time = 500 meters / (278,746 meters/second)

Time ≈ 0.00179 seconds

Therefore, it will take approximately 0.0018 seconds for plane A to collide with plane B.
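For completeness, the same division in Python; this only checks the arithmetic above and, like the whole approach, ignores plane B's own motion (addressed in the comments below):

```python
import math

speed = 100 * math.sqrt(3) * 1609.34  # plane A's speed in m/s (if the unit really is mi/s)
time = 500 / speed                    # time to cover 500 m at that speed
print(f"{time:.5f} s")                # about 0.00179 s
```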

If they are traveling at angles relative to the horizontal, they must be flying in the same vertical plane, not the same horizontal plane.

Also, I suspect A's speed is 100√3 m/s, not mi/s.

Using the law of cosines on the triangle formed by the initial 500 m line of sight and the two planes' displacements (the √3/2 factor is cos 30°),

500^2 + (100√3 t)^2 - 2(500)(100√3 t)(√3/2) = (100√3 t)^2

The (100√3 t)^2 terms cancel, leaving 250,000 = 150,000 t, so t = 5/3. They will collide in 5/3 seconds.
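As a check, here is a small SymPy sketch that just reproduces the equation above and solves it for t:

```python
import sympy as sp

t = sp.symbols('t', positive=True)
v = 100 * sp.sqrt(3)       # plane A's speed, taken as 100*sqrt(3) m/s
cos30 = sp.cos(sp.pi / 6)  # = sqrt(3)/2

# Law of cosines: 500 m line of sight, A's displacement v*t at 30 degrees
# to that line, and the third side (B's displacement) also equal to v*t.
eq = sp.Eq(500**2 + (v * t)**2 - 2 * 500 * (v * t) * cos30, (v * t)**2)
print(sp.solve(eq, t))     # [5/3]
```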