A baseball player hits a line drive to center field. He runs from second to third at 20 feet/sec. How fast is the distance from the runner to home plate changing when he is halfway to third base?

If I recall, the length of a side of a baseball diamond is 90 feet.

Make a diagram of a right triangle using the third-base line (90 ft) as one leg, the distance from the runner to home plate as the hypotenuse h, and the distance from the runner to third base as x feet.

then x^2 + 90^2 = h^2
2x dx/dt = 2h dh/dt
dh/dt = x (dx/dt) / h

we know dx/dt = -20 ft/sec
and when x= 45
h^2 = 45^2 + 90^2
h = √10125 = 45√5

dh/dt = 45(-20)/√10125 = -20/√5 ≈ -8.94 ft/sec

(the negative value of dx/dt shows that the value of x is decreasing, as I defined x)
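The arithmetic above can be checked with a short Python sketch (a minimal calculation, assuming the setup just described: x measured from the runner to third base):

```python
import math

# Side of the diamond and the runner's position (halfway from second to third)
side = 90.0    # feet, length of the third-base line
x = 45.0       # feet, runner's distance from third base
dx_dt = -20.0  # ft/sec, negative because x shrinks as he nears third

# Pythagorean relation: x^2 + side^2 = h^2
h = math.sqrt(x**2 + side**2)  # distance from runner to home plate
dh_dt = x * dx_dt / h          # from 2x dx/dt = 2h dh/dt

print(h)      # ≈ 100.62 feet
print(dh_dt)  # ≈ -8.94 ft/sec
```

The negative dh/dt confirms the runner is getting closer to home plate.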

To determine how fast the distance from the runner to home plate is changing, we need to find the rate of change of the distance with respect to time.

Let's define the following variables:
- x = distance of the runner from third base (feet)
- h = distance of the runner from home plate (feet)
- t = time (seconds)

We want dh/dt, the rate of change of h with respect to t. The runner's path from second to third and the third-base line meet at a right angle at third base, so by the Pythagorean theorem:

x^2 + 90^2 = h^2

Differentiating both sides with respect to t (implicit differentiation using the chain rule):

2x dx/dt = 2h dh/dt, which gives dh/dt = (x/h) dx/dt

The runner moves toward third base at a constant 20 feet/sec, so x is decreasing and dx/dt = -20 feet/sec. When he is halfway to third base, x = 45, so:

h = √(45^2 + 90^2) = √10125 = 45√5

dh/dt = (45/(45√5))(-20) = -20/√5 ≈ -8.94 feet/sec

Therefore, the distance from the runner to home plate is decreasing at about 8.94 feet/sec when he is halfway to third base. Note that this is not 20 feet/sec: that is his speed along the base path, not the rate of change of his distance to home plate.
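As a sanity check (a sketch under the same assumptions), we can approximate dh/dt numerically: advance the runner by a tiny time step and compare the change in distance to home plate against the derivative formula.

```python
import math

side = 90.0   # feet, third-base line
speed = 20.0  # ft/sec toward third base
x0 = 45.0     # feet from third base (halfway point)
dt = 1e-6     # small time step in seconds

def h(x):
    """Distance from the runner to home plate when he is x feet from third."""
    return math.sqrt(x**2 + side**2)

# After dt seconds, x has shrunk by speed * dt
approx = (h(x0 - speed * dt) - h(x0)) / dt  # finite-difference estimate
exact = x0 * (-speed) / h(x0)               # dh/dt = (x/h) dx/dt

print(approx)  # ≈ -8.944
print(exact)   # ≈ -8.944
```

The finite-difference estimate agrees with the implicit-differentiation result, which confirms the calculus.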