A tiger leaps horizontally from a 5.8 m-high rock with a speed of 3.6 m/s.

How far from the base of the rock will she land?

To find the horizontal distance that the tiger will cover when leaping from the rock, we need to use the equation of motion:

s = u*t + (1/2)*a*t^2

Where:
s is the horizontal distance covered
u is the initial horizontal velocity (which is constant since there is no horizontal acceleration)
t is the time of flight
a is the horizontal acceleration (which is zero, since no horizontal force acts on the tiger once airborne, neglecting air resistance)

In this case, we know that the initial horizontal velocity is 3.6 m/s and the time of flight can be determined by using the vertical motion of the tiger.

The vertical motion of the tiger can be analyzed using the equation of motion:

s = u*t + (1/2)*a*t^2

Where:
s is the vertical distance covered, which is equal to the height of the rock (5.8m)
u is the initial vertical velocity (which is zero, since the leap is purely horizontal)
t is the time of flight
a is the acceleration due to gravity, which is approximately 9.8 m/s^2

By plugging in the values, we can solve for the time of flight (t) in the vertical motion equation:

5.8m = (1/2)*9.8m/s^2*t^2

Rearranging the equation:

4.9 m/s^2 * t^2 = 5.8 m

Dividing both sides by 4.9 m/s^2:

t^2 = 5.8 / 4.9 ≈ 1.18 s^2

Taking the square root of both sides:

t ≈ 1.09 s
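The time-of-flight step can be checked numerically; a minimal sketch using only the values given in the problem:

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 5.8   # height of the rock, m

# From h = (1/2) * g * t^2, solve for t: t = sqrt(2h / g)
t = math.sqrt(2 * h / g)
print(f"time of flight: {t:.2f} s")  # ≈ 1.09 s
```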

Now that we have the time of flight (t) in the vertical motion, we can calculate the horizontal distance covered using the horizontal motion equation:

s = u*t

Substituting the values:

s = 3.6 m/s * 1.09 s

Multiplying the two values:

s ≈ 3.92 m

Therefore, the tiger will land approximately 3.9 meters from the base of the rock.
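Putting both steps together, the whole calculation is a few lines; a sketch assuming only the given height, speed, and g = 9.8 m/s^2:

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 5.8   # height of the rock, m
u = 3.6   # initial horizontal speed, m/s

t = math.sqrt(2 * h / g)  # time to fall height h from rest (vertically)
s = u * t                 # horizontal distance covered in that time
print(f"t ≈ {t:.2f} s, range ≈ {s:.2f} m")  # ≈ 1.09 s, ≈ 3.92 m
```

Because the horizontal velocity is constant, the range is just u multiplied by the fall time; changing g or h affects the answer only through the square root.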