posted by Cathy.
Problem: A hunter shoots an arrow at a deer directly away from him. When the arrow leaves the bow, the deer is at a distance of 38 m. When the arrow strikes, the deer is at a distance of 51.00 m. The speed of the arrow is 65.00 m/s. What must have been the speed of the deer? How long did the arrow take to travel to the deer?
My teacher said to calculate the time of flight to find how long the arrow traveled to reach the deer, but I don't know the launch angle of the arrow (unless it's 180 degrees).
Formula: t = (65 m/s * sin 180°)/(9.81 m/s^2)
If you don't shoot the arrow at an angle, it will NOT hit the deer; it will hit the ground first.
The horizontal speed of the arrow is 65 cos(theta).
The vertical speed is 65 sin(theta).
Horizontal distance = horizontal velocity * time,
or time = 51/(65 cos(theta))
Now put that time here:
Hfinal = Hinitial + Vi*t - (1/2) g t^2
0 = 65 sin(theta)*t - 4.9 t^2
Now substitute the expression for t you got above and solve for theta (it is a little messy, and you get two solutions).
Then go back and solve for the time, and the deer's speed follows from the 13 m it covered during that time.
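A small Python sketch of the procedure above, in case the algebra helps. Substituting t = 51/(65 cos(theta)) into 0 = 65 sin(theta)·t − 4.9 t² and simplifying gives sin(2·theta) = g·d/v², which is where the two solutions come from. I'm using g = 9.8 m/s² (matching the 4.9 t² term); the variable names are my own.

```python
import math

v = 65.0           # arrow launch speed, m/s
g = 9.8            # gravitational acceleration, m/s^2 (the 4.9 t^2 term is g/2)
d = 51.0           # horizontal distance when the arrow strikes, m
run = 51.0 - 38.0  # distance the deer covered during the flight, m

# Combining t = d/(v cos(theta)) with 0 = v sin(theta)*t - (g/2) t^2
# gives sin(2*theta) = g*d / v^2, hence two launch angles.
sin2theta = g * d / v**2
theta_flat = 0.5 * math.asin(sin2theta)   # shallow, flat trajectory
theta_lob = math.pi / 2 - theta_flat      # high lob (second solution)

for theta in (theta_flat, theta_lob):
    t = d / (v * math.cos(theta))         # time of flight
    print(f"theta = {math.degrees(theta):5.2f} deg, "
          f"t = {t:6.3f} s, deer speed = {run / t:5.2f} m/s")
```

The shallow solution (theta about 3.4 degrees, t about 0.79 s, deer speed about 16.5 m/s) is the physically sensible one for a hunter aiming at a deer; the near-vertical lob takes about 13 s and would require an implausibly slow deer.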