I'm very confused.

What is the range of an arrow shot horizontally at 85.3 m/s if it is initially 1.5 m above the ground?

I'm guessing 1.5 m is the delta Y, the initial velocity is 85.3 m/s, and the final velocity is 0 m/s. I just don't understand how I'm supposed to get the answer.

You have to break it into vertical and horizontal components.

Use the vertical motion to find how long it is in the air:
yfinal = yinitial + vyinitial*time + (1/2)*a*time^2
0 = 1.5 + 0*time - (1/2)*9.8*time^2
time in air = sqrt(2*1.5/9.8) ≈ 0.553 seconds

Now the horizontal distance (no horizontal acceleration):
xfinal = xinitial + vx*time
xfinal = 0 + 85.3*0.553 ≈ 47.2 m
and you have it.
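
If it helps to check the arithmetic, here is a minimal Python sketch of that same two-step calculation (the variable names v0, h, and g are just illustrative choices, and air resistance is ignored as in the problem):

    import math

    v0 = 85.3   # horizontal launch speed, m/s
    h = 1.5     # initial height above the ground, m
    g = 9.8     # magnitude of gravitational acceleration, m/s^2

    # Vertical motion: h = (1/2) * g * t^2  ->  t = sqrt(2h / g)
    t = math.sqrt(2 * h / g)

    # Horizontal motion: constant speed, so range = v0 * t
    rng = v0 * t

    print(f"time of flight ~ {t:.3f} s")  # about 0.553 s
    print(f"range ~ {rng:.1f} m")         # about 47.2 m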

To find the range of an arrow shot horizontally, you can use the formula for horizontal distance:

Range = (Initial Velocity) x (Time of Flight)

In this case, the initial velocity is given as 85.3 m/s. Since the arrow is shot horizontally, the initial vertical velocity is 0 m/s. The time of flight is the time it takes for the arrow to reach the ground.

To calculate the time of flight, you can use the formula:

Time of Flight = √((2 × Height) / (Acceleration due to gravity))

In this case, the height is 1.5 m and the acceleration due to gravity is approximately 9.8 m/s².

Substituting the values into the formula:

Time of Flight = √((2 × 1.5 m) / (9.8 m/s²))
Time of Flight ≈ 0.553 seconds

Now that you have the time of flight, you can calculate the range:

Range = (Initial Velocity) x (Time of Flight)
Range = (85.3 m/s) x (0.553 s)
Range ≈ 47.2 meters

Therefore, the range of the arrow, when shot horizontally at an initial velocity of 85.3 m/s from an initial height of 1.5 meters above the ground, is approximately 47.2 meters.
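
Those two steps collapse into one formula, Range = v × √(2 × Height / g). A quick way to verify the number above (the function name horizontal_range is just a placeholder, not something from the original post):

    import math

    def horizontal_range(v, h, g=9.8):
        # Range of a projectile launched horizontally at speed v from height h,
        # ignoring air resistance.
        return v * math.sqrt(2 * h / g)

    print(horizontal_range(85.3, 1.5))  # about 47.2 m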

To find the range of the arrow shot horizontally, we need to determine the horizontal distance it travels before hitting the ground. Given that the arrow is shot horizontally, there is no horizontal acceleration acting on it; gravity accelerates it only in the vertical direction.

To solve this problem, we can use the equation:

Range = Velocity × Time,

where Velocity represents the horizontal component of the arrow's initial velocity, and Time is the total time the arrow takes to hit the ground.

Since the arrow is shot horizontally, the initial vertical velocity is zero. The only velocity component that matters here is the horizontal component, which is equal to the initial velocity of the arrow.

First, let's calculate the time it takes for the arrow to hit the ground. We can use the equation for vertical displacement with constant acceleration:

delta Y = (Initial velocity × Time) + (0.5 × Acceleration × Time^2),

where delta Y is the vertical displacement, Initial velocity is the initial vertical velocity, Acceleration is the acceleration due to gravity, and Time is the total time taken.

In this case, the initial vertical velocity is zero, the vertical displacement is -1.5 m (negative because the arrow moves downwards), and the acceleration is approximately -9.8 m/s^2 (negative because gravity also acts downwards). Since we want to find the time when the arrow reaches the ground, we can set delta Y to -1.5 m:

-1.5 = (0 × Time) + (0.5 × (-9.8) × Time^2).

Now we can solve this equation to find the time taken for the arrow to hit the ground. Multiplying both sides by -1:

0.5 × 9.8 × Time^2 = 1.5,

4.9 × Time^2 = 1.5,

Time^2 = 1.5 / 4.9,

Time^2 ≈ 0.306,

Time ≈ √0.306,

Time ≈ 0.553 seconds.

Now that we have the time taken for the arrow to hit the ground, we can calculate the range by multiplying the horizontal velocity by the time:

Range = Velocity × Time,

Range = 85.3 m/s × 0.553 s,

Range ≈ 47.2 meters.

So, the range of the arrow shot horizontally at 85.3 m/s, with an initial height of 1.5 meters, is approximately 47.2 meters.
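
As a sanity check on the algebra, a small Euler time-stepping simulation (the step size dt is an arbitrary choice, not part of the original solution) lands on essentially the same numbers:

    vx, vy = 85.3, 0.0   # horizontal and vertical velocity, m/s
    x, y = 0.0, 1.5      # start at the launch height, m
    g = 9.8              # m/s^2
    dt = 1e-4            # time step, s
    t = 0.0

    # Advance the motion until the arrow reaches the ground (y = 0)
    while y > 0:
        x += vx * dt
        y += vy * dt
        vy -= g * dt
        t += dt

    print(f"time ~ {t:.3f} s, range ~ {x:.1f} m")  # about 0.553 s and 47.2 m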

This type of question is sort of a trick question, because the horizontal velocity has no effect on the time in the air.

The time of flight is no different than dropping an object from 1.5 m.

Just calculate how long it takes an object to fall 1.5 m due to gravity, then multiply that time by the horizontal speed to get the range.
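
A quick way to see that in code: the fall time depends only on the height, and the horizontal speed only scales the range (the speeds 10 and 85.3 m/s below are just example values):

    import math

    h, g = 1.5, 9.8
    t_fall = math.sqrt(2 * h / g)  # same for any horizontal speed
    for v in (10.0, 85.3):
        print(f"v = {v} m/s: time ~ {t_fall:.3f} s, range ~ {v * t_fall:.1f} m")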