An arrow is fired horizontally at a target 30 meters away. The bullseye is at the same level as the bow. The initial speed of the arrow is 50 m/s. How long does it take to get to the target? How far below the bullseye does it fall?

With air resistance ignored, the horizontal velocity component stays at 50 m/s for the whole flight. Divide the distance by that speed to get the time t it takes to reach the target.

For the distance the arrow falls en route, use
y = (1/2) g t^2

To find the time it takes for the arrow to reach the target, we can use the formula:

time = distance / velocity

In this case, the distance is 30 meters and the velocity is 50 m/s. Plugging these values into the formula, we get:

time = 30 meters / 50 m/s
time = 0.6 seconds

So, it takes exactly 0.6 seconds for the arrow to reach the target.
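The time calculation can be sketched in a few lines of Python (variable names are my own, chosen for clarity):

```python
# Horizontal flight time of a projectile fired level at a target.
# Horizontal speed is constant (no air resistance assumed).
distance = 30.0        # meters to the target
horizontal_speed = 50.0  # m/s, launch speed (fired horizontally)

flight_time = distance / horizontal_speed
print(flight_time)  # 0.6
```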

Now, to find how far below the bullseye the arrow falls, we calculate the vertical distance the arrow drops during this time. The vertical motion is independent of the horizontal motion: the arrow starts with zero vertical velocity and accelerates downward under gravity.

The vertical distance can be found using the formula:

distance = (1/2) * g * time^2

Where:
- "g" is the acceleration due to gravity, which is approximately 9.8 m/s^2.
- "time" is the time taken for the arrow to reach the target, which we found to be 0.6 seconds.

Plugging these values into the formula, we get:

distance = (1/2) * 9.8 m/s^2 * (0.6 s)^2
distance = 1.764 meters

So, the arrow falls 1.764 meters (about 1.8 m) below the bullseye.
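The full solution, both flight time and vertical drop, can be checked with a short Python sketch (the variable names are illustrative, not from the problem statement):

```python
# Projectile fired horizontally at a level target:
# time from horizontal motion, drop from free fall over that time.
distance = 30.0          # m, horizontal distance to target
horizontal_speed = 50.0  # m/s, launch speed
g = 9.8                  # m/s^2, acceleration due to gravity

flight_time = distance / horizontal_speed    # t = d / v
drop = 0.5 * g * flight_time ** 2            # y = (1/2) g t^2

print(flight_time)  # 0.6
print(drop)         # 1.764
```

Note how the drop grows with the square of the flight time: doubling the distance to the target (at the same launch speed) would quadruple how far the arrow falls.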