A plane flying horizontally at an altitude of 1 mi and a speed of 500 mi/h passes directly over a radar station. Find the rate at which the distance from the plane to the station is increasing when it is 2 mi away from the station.

I drew a diagram and figured out I need to find dd/dt, the rate of change of the distance d from the plane to the radar station with respect to time.

I let y be the altitude of the plane (1 mi).

I let x be the horizontal distance from the radar station, which I took to be 2 mi.

dx/dt = 500 mi/h

I noticed I could use the Pythagorean theorem here.

d^2 = x^2 + y^2

Plugging in x = 2 and y = 1, I solved for d and got sqrt(5).

I then differentiated the equation to get 2d*(dd/dt) = 2x*(dx/dt) + 2y*(dy/dt)

y is constant, so dy/dt = 0

I evaluated it to get dd/dt = [2(2)(500)]/(2 sqrt(5)) = 1000/sqrt(5),
which is approximately 447 mi/h.
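
Just to sanity-check the arithmetic under my reading (taking x = 2 mi as the horizontal distance), a quick numerical sketch in Python gives the same result of about 447 mi/h:

    import math

    # my reading: x = 2 mi is the horizontal distance, y = 1 mi is the altitude
    x, y = 2.0, 1.0
    dxdt = 500.0                # horizontal speed in mi/h
    d = math.sqrt(x**2 + y**2)  # straight-line distance, sqrt(5)
    dddt = x * dxdt / d         # from 2d*(dd/dt) = 2x*(dx/dt)
    print(d, dddt)              # ~2.236, ~447.2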

My problem is that the given answer is 250 sqrt(3), which is approximately 433 mi/h.

What am I doing wrong?

Apparently the 2 miles is the straight-line distance from the plane to the station, so d = 2 and x = sqrt(3):

2d*(dd/dt) = 2x*(dx/dt)
2(2)*(dd/dt) = 2 sqrt(3)*(500)
dd/dt = 250 sqrt(3)
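
If it helps, you can check this numerically with a quick sketch (treating the horizontal position as x(t) = 500t, so d(t) = sqrt((500t)^2 + 1), and taking a central difference at the moment d = 2; the function name is just for illustration):

    import math

    def dist(t):
        # straight-line distance at time t, with x(t) = 500*t mi and y = 1 mi
        return math.sqrt((500.0 * t) ** 2 + 1.0)

    t0 = math.sqrt(3) / 500.0                       # the moment when dist(t) = 2
    h = 1e-6
    rate = (dist(t0 + h) - dist(t0 - h)) / (2 * h)  # central difference
    print(rate, 250 * math.sqrt(3))                 # both ~433.0 mi/h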

I'm confused as to how you got d = 2 and x = sqrt(3).

To see where d = 2 and x = sqrt(3) come from, look closely at the wording of the problem: "when it is 2 mi away from the station." The distance from the plane to the station is the straight-line distance, i.e. the hypotenuse d, not the horizontal leg x. So the 2 miles is the value of d.

Restating the given values:
- The altitude of the plane (y) is 1 mile, and it stays constant, so dy/dt = 0.
- The straight-line distance from the plane to the station (d) is 2 miles at the moment in question.
- The horizontal speed of the plane (dx/dt) is 500 mi/h.

From the Pythagorean theorem d^2 = x^2 + y^2, at that moment:
x = sqrt(d^2 - y^2) = sqrt(4 - 1) = sqrt(3)

Differentiating d^2 = x^2 + y^2 with respect to time t:
2d*(dd/dt) = 2x*(dx/dt) + 2y*(dy/dt)

Since y is constant, dy/dt = 0, and the equation reduces to:
d*(dd/dt) = x*(dx/dt)

Substituting d = 2, x = sqrt(3), and dx/dt = 500:
2*(dd/dt) = sqrt(3)*(500)
dd/dt = 250 sqrt(3)

So the rate at which the distance from the plane to the station is increasing is 250 sqrt(3) mi/h, which is approximately 433 mi/h.

Your mistake was treating the 2 miles as the horizontal distance x. That choice gives d = sqrt(5) and dd/dt = 1000/sqrt(5), or about 447 mi/h, which is exactly the number you computed, but it answers a different question. The given answer of 250 sqrt(3) mi/h is correct.
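
If you want to double-check the algebra symbolically, here is a short sketch using sympy (assuming sympy is available; the setup just mirrors the variables above):

    import sympy as sp

    t = sp.symbols('t', positive=True)
    x = 500 * t                       # horizontal distance in mi (x = 0 over the station)
    y = 1                             # constant altitude in mi
    d = sp.sqrt(x**2 + y**2)          # straight-line distance to the station

    t_star = sp.solve(sp.Eq(d, 2), t)[0]   # the moment when d = 2 (x = sqrt(3))
    rate = sp.diff(d, t).subs(t, t_star)
    print(sp.simplify(rate))          # 250*sqrt(3), about 433 mi/h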