An arrow with an initial speed of 40 m/s is aimed at a target level with it, at a distance of 100 m from the point of projection. Find the least time of flight for the arrow to hit the target.

Find the launch angle from the range equation, then divide the horizontal distance by the horizontal component of the velocity.

Because the target is level with the launch point, gravity forces the arrow to be fired at an angle θ above the horizontal, so the flight time is not simply distance divided by speed. The range equation fixes the possible angles:

sin 2θ = gd / v₀² = (9.8 × 100) / 40² = 0.6125

so θ ≈ 18.9° or θ ≈ 71.1°. The time of flight is the horizontal distance divided by the horizontal component of the velocity:

t = d / (v₀ cos θ)

This is smallest for the smaller angle:

t = 100 / (40 × cos 18.9°) ≈ 2.64 s

Therefore, the least time of flight for the arrow to hit the target is about 2.64 seconds.
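
A minimal Python sketch of this calculation, assuming g = 9.8 m/s² and no air resistance (the variable names are illustrative, not from the problem):

```python
import math

v0 = 40.0   # initial speed (m/s)
d = 100.0   # horizontal distance to the target (m)
g = 9.8     # acceleration due to gravity (m/s^2)

# Range equation: d = v0^2 * sin(2*theta) / g
sin_2theta = g * d / v0**2                            # 0.6125
theta_low = 0.5 * math.asin(sin_2theta)               # ~18.9 deg (flat shot)
theta_high = 0.5 * (math.pi - math.asin(sin_2theta))  # ~71.1 deg (lobbed shot)

# Time of flight = horizontal distance / horizontal speed
t_low = d / (v0 * math.cos(theta_low))    # ~2.64 s, the least time
t_high = d / (v0 * math.cos(theta_high))  # ~7.72 s

print(f"least time of flight: {t_low:.2f} s")
```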

To solve this problem, we resolve the arrow's motion into horizontal and vertical components. Horizontally there is no acceleration; vertically the arrow accelerates downward at \(g\). For a target level with the point of projection:

\[d = (v_0\cos\theta)\,t \qquad 0 = (v_0\sin\theta)\,t - \frac{1}{2}gt^2\]

Where:
- \(d\) is the horizontal distance to the target (100 m)
- \(v_0\) is the initial speed of the arrow (40 m/s)
- \(\theta\) is the launch angle above the horizontal
- \(t\) is the time of flight (unknown in this case)
- \(g\) is the acceleration due to gravity (9.8 m/s²; air resistance is neglected)

The vertical equation gives the time of flight:

\[t = \frac{2v_0\sin\theta}{g}\]

Substituting this into the horizontal equation and using \(2\sin\theta\cos\theta = \sin 2\theta\) gives the range equation, which determines the launch angle:

\[\sin 2\theta = \frac{gd}{v_0^2}\]

Now we substitute the given values into the equation:

\[\sin 2\theta = \frac{9.8 \times 100}{40^2} = 0.6125\]

so \(2\theta \approx 37.8^\circ\) or \(142.2^\circ\), i.e. \(\theta \approx 18.9^\circ\) or \(71.1^\circ\). The smaller angle gives the smaller \(\sin\theta\) and hence the shorter flight:

\[t = \frac{2 \times 40 \times \sin 18.9^\circ}{9.8} \approx 2.64\,\text{s}\]

Therefore, the least time of flight for the arrow to hit the target is approximately 2.64 seconds; the steeper angle of 71.1° would hit the same target after about 7.72 s.
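
As a cross-check that avoids computing the angle explicitly, \(\theta\) can be eliminated between the two component equations (via \(\sin^2\theta + \cos^2\theta = 1\)), leaving a quadratic in \(t^2\) whose smaller root is the least time. A sketch in Python under the same assumptions (g = 9.8 m/s², no air resistance):

```python
import math

v0, d, g = 40.0, 100.0, 9.8

# From d = (v0*cos(theta))*t and 0 = (v0*sin(theta))*t - 0.5*g*t^2,
# eliminating theta with sin^2 + cos^2 = 1 gives a quadratic in u = t^2:
#   g^2*u^2 - 4*v0^2*u + 4*d^2 = 0
disc = (4 * v0**2) ** 2 - 4 * g**2 * (4 * d**2)
u_min = (4 * v0**2 - math.sqrt(disc)) / (2 * g**2)  # smaller root -> least time
u_max = (4 * v0**2 + math.sqrt(disc)) / (2 * g**2)  # larger root -> lobbed shot

print(f"least time:   {math.sqrt(u_min):.2f} s")  # ~2.64 s
print(f"longest time: {math.sqrt(u_max):.2f} s")  # ~7.72 s
```

No real roots would mean the target is out of range; here the discriminant is positive, confirming that both trajectories exist.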