A tiger leaps horizontally from a 15 m high rock with a speed of 7.0 m/s. How far from the base of the rock will it land?

To solve this problem, we can use the kinematic equation for horizontal motion:

\[d = v_x \times t\]

Where:
- \(d\) is the distance from the base of the rock that the tiger will land
- \(v_x\) is the horizontal component of the velocity (since there is no horizontal acceleration, the velocity remains constant)
- \(t\) is the time of flight

First, we need to find the time of flight. This can be done using the kinematic equation for vertical motion:

\[y = y_0 + v_{oy} \times t + \frac{1}{2} \times a \times t^2\]

Where:
- \(y\) is the final vertical position (0 m, since the tiger lands on the ground)
- \(y_0\) is the initial vertical position (15 m)
- \(v_{oy}\) is the vertical component of the initial velocity (0 m/s, since the tiger leaps horizontally)
- \(a\) is the acceleration due to gravity (-9.8 m/s^2)

Plugging in the values, we can find the time of flight:

\[0 = 15 + 0 \times t + \frac{1}{2} \times (-9.8) \times t^2\]
\[4.9\,t^2 = 15\]
\[t = \sqrt{\frac{15}{4.9}} \approx 1.75\ \text{s}\]

Now, we can find the horizontal distance:

\[d = 7.0 \times 1.75\]
\[d \approx 12.25\ \text{m}\]

Therefore, the tiger will land approximately 12.25 meters from the base of the rock (about 12 m when rounded to the two significant figures of the given data).
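The two steps above (solving for the time of flight, then multiplying by the horizontal speed) can be checked with a short Python sketch:

```python
import math

g = 9.8    # acceleration due to gravity (m/s^2)
h = 15.0   # initial height of the rock (m)
vx = 7.0   # horizontal launch speed (m/s)

# Time of flight from the vertical equation: h = (1/2) * g * t^2
t = math.sqrt(2 * h / g)

# Horizontal distance: constant horizontal velocity times time of flight
d = vx * t

print(f"t = {t:.2f} s, d = {d:.2f} m")  # prints "t = 1.75 s, d = 12.25 m"
```

This reproduces the hand calculation: about 1.75 s in the air and roughly 12 m of horizontal travel.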