If an arrow was shot at such an angle that for every second of travel it gained a lateral change of 50 feet away from the archer, how far from the cliff base does the arrow land?

To determine how far away the arrow lands from the cliff base, we need to calculate the horizontal distance it travels.

Let's break down the information provided:
- For every second of travel, the arrow moves 50 feet laterally away from the archer, i.e., a lateral rate of 50 feet per second.
- We assume that "lateral change" refers to the horizontal distance traveled by the arrow.

We can use the equation for horizontal distance, which is the product of the flight time (in seconds) and the lateral rate (in feet per second). In other words, horizontal distance = 50 ft/s × flight time.
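As a minimal sketch of that relationship (the 3-second flight time below is a placeholder, since the problem does not supply it):

```python
LATERAL_RATE_FT_PER_S = 50.0  # given: the arrow drifts 50 ft from the archer each second

def horizontal_distance(flight_time_s: float) -> float:
    """Horizontal distance (in feet) covered during a given flight time (in seconds)."""
    return LATERAL_RATE_FT_PER_S * flight_time_s

# Placeholder flight time of 3 seconds, purely for illustration:
print(horizontal_distance(3.0))  # 150.0 ft
```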

To calculate the time it takes for the arrow to land, we need to consider its vertical motion. If we assume the arrow follows a parabolic trajectory (that is, air resistance is negligible), we can apply the principles of projectile motion to find the flight time.

In this case, we need additional information about the vertical motion, such as the arrow's initial velocity and launch angle, as well as the height of the cliff. Without this information, we cannot accurately determine the time it takes for the arrow to land.
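If those values were known, the flight time could be found by solving the vertical-motion equation h + v₀·sin(θ)·t − ½·g·t² = 0 for its positive root. The sketch below uses purely illustrative numbers (an 80 ft cliff, a 120 ft/s launch speed, and a 30° launch angle); none of these come from the problem statement.

```python
import math

G_FT_PER_S2 = 32.2  # gravitational acceleration in ft/s^2

def time_of_flight(cliff_height_ft: float, launch_speed_ft_s: float,
                   launch_angle_deg: float) -> float:
    """Time (in seconds) until the arrow reaches the ground below the cliff, ignoring drag.

    Solves 0.5*g*t^2 - v0*sin(theta)*t - h = 0 and returns the positive root.
    """
    v_vertical = launch_speed_ft_s * math.sin(math.radians(launch_angle_deg))
    discriminant = v_vertical ** 2 + 2 * G_FT_PER_S2 * cliff_height_ft
    return (v_vertical + math.sqrt(discriminant)) / G_FT_PER_S2

# Illustrative values only -- not given in the problem:
t = time_of_flight(cliff_height_ft=80.0, launch_speed_ft_s=120.0, launch_angle_deg=30.0)
print(round(t, 2))  # about 4.77 s with these assumed numbers
```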

Once we have the flight time, we can multiply it by the given lateral rate of 50 feet per second to find the horizontal distance traveled, which is how far from the cliff base the arrow lands.
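Carrying through the illustrative flight time from the sketch above, the final step would look like this; the result is meaningful only for those assumed inputs:

```python
flight_time_s = 4.77          # assumed flight time from the illustrative sketch above
lateral_rate_ft_per_s = 50.0  # given in the problem
print(round(lateral_rate_ft_per_s * flight_time_s, 1))  # 238.5 ft, for these assumed inputs only
```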