A tiger leaps horizontally from a 15.0 m high rock with a speed of 7.10 m/s. How far (in meters) from the base of the rock will it land?

So using the equation h = 1/2 g t^2,
I put in 15 for h and 9.8 for g and solve for t. Is this right, and when do I use the 7.10 m/s?

The equation h = 1/2 g t^2 you mentioned is actually the right one to start with. Because the tiger leaps horizontally, its initial vertical velocity is zero, so the vertical part of the motion is ordinary free fall and that equation gives you the fall time directly.

To find how far from the base of the rock the tiger lands, treat the horizontal and vertical parts of the projectile motion independently.

The key equations you need to use are:

1. For vertical motion (initial vertical velocity vi = 0, since the leap is horizontal):
- h = vi*t + 1/2 g t^2, which reduces to h = 1/2 g t^2

2. For horizontal motion (constant velocity, since gravity acts only vertically):
- d = v * t
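
If you want to sanity-check the algebra numerically, here is a minimal Python sketch of those two relations (the names fall_time and horizontal_range are just illustrative, and g = 9.8 m/s^2 is assumed):

```python
import math

def fall_time(h, g=9.8):
    # Solve h = 1/2 * g * t^2 for t (vertical drop from zero initial vertical speed): t = sqrt(2h / g)
    return math.sqrt(2 * h / g)

def horizontal_range(v, t):
    # Constant horizontal speed v for a time t: d = v * t
    return v * t
```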

Here's how you can solve the problem step by step:

1. First, calculate the time it takes for the tiger to reach the ground using the vertical equation. The initial vertical velocity (vi) is 0 m/s because the leap is horizontal, and the acceleration due to gravity (g) is 9.8 m/s^2 downward, so:

- h = 1/2 g t^2
- 15.0 = 1/2 (9.8) t^2
- t^2 = 2(15.0) / 9.8 ≈ 3.06
- t ≈ 1.75 seconds

(Only the positive root makes physical sense, since the tiger lands after it jumps.)
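
As a quick numeric check of this step (a throwaway snippet, again assuming g = 9.8 m/s^2):

```python
import math

t = math.sqrt(2 * 15.0 / 9.8)  # fall time from h = 1/2 * g * t^2
print(round(t, 2))             # ≈ 1.75 (seconds)
```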

2. Next, use the equation for horizontal motion to find the distance (d) the tiger lands from the base of the rock. The horizontal velocity (v) stays at 7.10 m/s for the whole flight, and the time in the air is the t ≈ 1.75 s found in step 1, so use d = v * t.

- d = (7.10) * (1.75)
- d ≈ 12.4 meters

Therefore, the tiger lands about 12.4 meters from the base of the rock. So your approach was right: h = 1/2 g t^2 gives the time in the air, and the 7.10 m/s comes in at the end to turn that time into a horizontal distance.
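
And the whole calculation in one self-contained check (just a sketch, with g = 9.8 m/s^2 assumed):

```python
import math

h = 15.0    # height of the rock in meters
v = 7.10    # horizontal launch speed in m/s
g = 9.8     # gravitational acceleration in m/s^2

t = math.sqrt(2 * h / g)  # time to fall the height h
d = v * t                 # horizontal distance covered in that time
print(round(d, 1))        # ≈ 12.4 (meters)
```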