A plane flying horizontally at an altitude of 2 mi and a speed of 420 mi/h passes directly over a radar station. Find the rate at which the distance from the plane to the station is increasing when it is 5 mi away from the station.

Much appreciated

To find the rate at which the distance from the plane to the radar station is increasing, we use related rates: write an equation relating the distances that holds at every instant, then differentiate it with respect to time.

Let's consider the situation as shown in the diagram below:

        A (plane)
        |\
        | \
  2 mi  |  \  s
        |   \
        B--------C (radar station)
            x

Let:
A be the position of the plane,
B be the point on the ground directly below the plane,
C be the radar station,
x = BC be the horizontal distance from the station to the point below the plane,
s = AC be the straight-line distance from the plane to the station.

We are looking for the rate at which the distance s = AC is increasing (ds/dt) when s = 5 mi.

From the information given, the plane flies horizontally at a constant altitude, so AB = 2 mi at all times, and ABC is a right triangle with the right angle at B.

Let's use the Pythagorean Theorem to relate the variables:

AC^2 = AB^2 + BC^2, that is, s^2 = x^2 + 2^2

Differentiating both sides with respect to time t (the constant 2^2 differentiates to 0), we have:

2s * ds/dt = 2x * dx/dt

Dividing both sides by 2:

s * ds/dt = x * dx/dt
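If you'd like to double-check the implicit differentiation, here is a short SymPy sketch (assuming SymPy is available; the variable names are just illustrative):

```python
import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')(t)  # horizontal distance BC, as a function of time
s = sp.Function('s')(t)  # plane-to-station distance AC, as a function of time

# Differentiate both sides of s^2 = x^2 + 2^2 with respect to t.
lhs = sp.diff(s**2, t)
rhs = sp.diff(x**2 + 2**2, t)
print(sp.Eq(lhs, rhs))
# Eq(2*s(t)*Derivative(s(t), t), 2*x(t)*Derivative(x(t), t))
```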

Since the plane is flying horizontally, its 420 mi/h speed is the rate at which the horizontal distance grows, so dx/dt = 420 mi/h. Note that ds/dt is what we are solving for; the straight-line distance grows more slowly than the horizontal distance.

So, we have:

s * ds/dt = x * 420

We're given that the plane is 5 mi from the station, so s = 5. The Pythagorean relation gives the horizontal distance at that moment:

x = sqrt(s^2 - 2^2) = sqrt(25 - 4) = sqrt(21)

Substituting the values:

5 * ds/dt = sqrt(21) * 420

Solving for ds/dt:

ds/dt = (420 * sqrt(21)) / 5

ds/dt = 84 * sqrt(21) ≈ 385 mi/h

Therefore, the rate at which the distance from the plane to the station is increasing when it is 5 mi away from the station is 84 * sqrt(21) ≈ 385 mi/h.
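As an independent sanity check, here is a small plain-Python sketch (the function name and step size are just illustrative) that approximates ds/dt with a central finite difference instead of calculus:

```python
import math

ALTITUDE = 2.0  # mi, constant
SPEED = 420.0   # mi/h, horizontal speed

def distance_to_station(time_h):
    """Straight-line distance from plane to station, time_h hours
    after the plane passes directly over the station."""
    horizontal = SPEED * time_h
    return math.hypot(horizontal, ALTITUDE)

# The plane is 5 mi from the station when 420t = sqrt(5^2 - 2^2) = sqrt(21).
t_at_5mi = math.sqrt(5**2 - ALTITUDE**2) / SPEED

# Central finite difference approximating ds/dt at that moment.
step = 1e-7
rate = (distance_to_station(t_at_5mi + step)
        - distance_to_station(t_at_5mi - step)) / (2 * step)
print(rate)  # ≈ 384.94 mi/h, matching 84*sqrt(21)
```

The numerical estimate agrees with the exact value, which is a good sign the geometric setup (not just the algebra) is right.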
