A quarterback throws a football with a velocity vo at an angle of 45° with the horizontal. At the same instant a receiver standing 20 ft in front of him starts running down the field at 15 ft/s and catches the ball. What is the distance of the receiver from the quarterback when the ball is caught? Assume the ball is thrown and caught at the same height above the ground.

distance the ball travels = receiver's distance from QB when caught

The time of flight is 2 vo sin 45°/g, and the range at 45° is vo^2 sin 90°/g = vo^2/g, so:

[2 vo sin 45°/g]*vr + 20 = vo^2/g
vr = receiver's velocity = 15 ft/s
g = 32.2 ft/s^2

You have one unknown, vo. Solve for it (using the quadratic formula) and use the answer to compute the distance thrown, vo^2/g.
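
For concreteness, here is a quick numerical sketch of that solution in Python (the variable names are mine, not from the problem):

import math

# Catch condition from above: (2*vo*sin45/g)*vr + 20 = vo**2/g
# Multiplying through by g:   vo**2 - 2*vr*sin45*vo - 20*g = 0
g = 32.2                                 # ft/s^2
vr = 15.0                                # receiver's speed, ft/s
s45 = math.sin(math.radians(45))

b = -2 * vr * s45                        # quadratic is vo**2 + b*vo + c = 0
c = -20 * g
vo = (-b + math.sqrt(b*b - 4*c)) / 2     # keep the positive root
print(f"vo = {vo:.2f} ft/s")             # ~38.11 ft/s
print(f"distance = {vo**2 / g:.1f} ft")  # ~45.1 ft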

To solve this problem, we need to break it down into two parts: the horizontal motion and the vertical motion.

First, let's consider the horizontal motion. Since the quarterback throws the ball at an angle of 45 degrees with the horizontal, the initial velocity of the ball can be split into horizontal and vertical components. The horizontal component will remain constant throughout the motion.

The horizontal velocity component (Vx) can be found using the formula:
Vx = vo * cos(45°)

The launch speed vo is not given, so we carry it as an unknown for now. The requirement that the receiver actually catches the ball will determine it below.

Now let's consider the vertical motion, which determines how long the ball is in the air. Since the ball is thrown and caught at the same height, it rises and falls symmetrically under gravity.

To find the time the ball is in the air, we can use the equation of motion for the vertical position:
y = yo + Vyo * t - (1/2) * g * t^2

Here the initial height yo is zero, and the initial vertical velocity is not zero but Vyo = vo * sin(45°). The ball is caught at the same height it was thrown, so set y = 0:
0 = vo * sin(45°) * t - (1/2) * g * t^2

Discarding the trivial root t = 0, the time of flight is:
t = 2 * vo * sin(45°) / g
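
As a small illustration (a sketch; the function name is mine), the same relation in Python:

import math

def flight_time(vo, g=32.2):
    # Time aloft for a 45-degree launch caught at the launch height.
    return 2 * vo * math.sin(math.radians(45)) / g

# e.g. flight_time(40.0) is about 1.76 s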

Now let's find the receiver's distance from the quarterback at the moment of the catch. The receiver starts 20 ft in front of the quarterback and runs at 15 ft/s, so after time t that distance is:
D = 20 + 15 * t

Finally, for the ball to be caught, the horizontal distance covered by the ball (Vx * t) must equal the receiver's distance D, not be added to it:
Vx * t = 20 + 15 * t

Substituting Vx = vo * cos(45°) and t = 2 * vo * sin(45°) / g, and using 2 * sin(45°) * cos(45°) = sin(90°) = 1, the left side reduces to vo^2/g:
vo^2/g = 20 + 15 * [2 * vo * sin(45°) / g]

Multiplying through by g gives a quadratic in vo:
vo^2 - 30 * sin(45°) * vo - 20 * g = 0

Solving with the quadratic formula and taking the positive root gives vo ≈ 38.1 ft/s. The time of flight is then t ≈ 1.67 s, and the receiver's distance from the quarterback when the ball is caught is vo^2/g ≈ 45.1 ft, which matches the setup at the top.
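
Plugging the positive root back in shows the ball and the receiver really do meet (a quick consistency check in Python; vo is rounded):

import math

g, vr, vo = 32.2, 15.0, 38.11   # vo from the quadratic above
t = 2 * vo * math.sin(math.radians(45)) / g
ball = vo**2 / g                # ball's horizontal range, ft
receiver = 20 + vr * t          # receiver's distance from the QB, ft
print(f"t = {t:.2f} s, ball = {ball:.1f} ft, receiver = {receiver:.1f} ft")
# t ≈ 1.67 s, ball ≈ 45.1 ft, receiver ≈ 45.1 ft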