An archer shoots an arrow at a target 72.0 m away; the bull's-eye is at the same height as the release height of the arrow.

(a) At what angle must the arrow be released to hit the bull's-eye if its initial speed is 36.0 m/s? (Although neglected here, the atmosphere provides significant lift to real arrows.)
Help help help?!

time in air: t ≈ distance/speed = 72/36 = 2 s (approximate: this treats the full 36 m/s as horizontal, i.e. cos(theta) ≈ 1, so it's a first estimate)

hf = hi + v*sinTheta*t - 4.9t^2 (the vertical launch component carries the arrow upward, so the sinTheta term is positive)

hf=hi and you know time t, solve for theta
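
A quick Python sketch of this hint (my own script, not from the thread; since the 2 s flight time assumes the whole 36 m/s is horizontal, the result is only approximate):

```python
import math

v = 36.0   # initial speed (m/s)
R = 72.0   # horizontal distance to target (m)
g = 9.8    # gravitational acceleration (m/s^2)

t = R / v                    # ~2.0 s, assumes cos(theta) ~ 1
sin_theta = 0.5 * g * t / v  # from v*sin(theta)*t = 4.9*t^2
print(math.degrees(math.asin(sin_theta)))  # ~15.8 degrees (approximate)
```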

To answer this question, we can use the equations of projectile motion for a launch and landing at the same height. The two key equations are:

1. Range equation: R = (v^2 * sin(2θ)) / g
2. Maximum-height equation: h = (v^2 * sin^2(θ)) / (2g) (listed for completeness; only the range equation is needed here)

where:
- R is the range (horizontal distance) from the archer to the target (72.0 m in this case)
- v is the initial speed of the arrow (36.0 m/s in this case)
- θ is the angle at which the arrow is released
- g is the acceleration due to gravity (9.8 m/s^2)
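
As a sketch, here are the two formulas as small Python helpers (the function names are mine, chosen for illustration):

```python
import math

def projectile_range(v, theta_deg, g=9.8):
    """Horizontal range when launch and landing heights are equal."""
    return v**2 * math.sin(math.radians(2.0 * theta_deg)) / g

def max_height(v, theta_deg, g=9.8):
    """Peak height above the launch point (not needed for part (a))."""
    return (v * math.sin(math.radians(theta_deg)))**2 / (2.0 * g)

print(projectile_range(36.0, 16.5))  # ~72 m, matching the target distance
```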

We need to solve for θ in the range equation. Let's rearrange the equation to isolate θ:

R = (v^2 * sin(2θ)) / g

First, let's simplify the equation by substituting v = 36.0 m/s and g = 9.8 m/s^2:

72.0 = (36.0^2 * sin(2θ)) / 9.8

Now, let's solve for sin(2θ):

sin(2θ) = (72.0 * 9.8) / (36.0^2)

sin(2θ) = 705.6 / 1296 ≈ 0.544

To find the angle θ, we need to take the inverse sine (sin^(-1)) of both sides:

2θ = sin^(-1)(0.544) ≈ 33.0°

Divide both sides by 2:

θ = (1/2) * sin^(-1)(0.544)

θ ≈ 16.5 degrees

Therefore, the arrow must be released at an angle of approximately 16.5 degrees to hit the bull's-eye at a distance of 72.0 m with an initial speed of 36.0 m/s. (Since sin(2θ) = sin(180° - 2θ), launching at the complementary angle of 90° - 16.5° = 73.5° would also hit the target, but on a much higher, slower arc.)
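
To double-check the result, here is a short Python verification (a sketch using the same numbers as above):

```python
import math

v, R, g = 36.0, 72.0, 9.8

two_theta = math.asin(g * R / v**2)       # asin returns the first-quadrant solution
theta_low = math.degrees(two_theta) / 2.0
theta_high = 90.0 - theta_low             # complementary angle, same range

print(f"low trajectory:  {theta_low:.1f} degrees")   # ~16.5
print(f"high trajectory: {theta_high:.1f} degrees")  # ~73.5

# Plug the low angle back into the range equation as a sanity check.
print(f"range: {v**2 * math.sin(math.radians(2 * theta_low)) / g:.1f} m")  # ~72.0
```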