An object is traveling horizontally when it reaches a 50-meter drop. How fast must the object be traveling to land 90 meters away from the edge of the drop?

To calculate the speed at which the object must be traveling to land 90 meters away from the edge of the drop, we can use basic kinematic equations. Let's break down the steps:

1. Identify the variables:
- Initial horizontal velocity (Vx): the unknown we are solving for, the speed at which the object travels horizontally before reaching the drop.
- Vertical distance (h): 50 meters, the height of the drop.
- Horizontal distance (d): 90 meters, the distance from the edge of the drop at which the object must land.
- Acceleration due to gravity (g): approximately 9.8 m/s², assuming Earth's gravity.

2. We can calculate the time it takes for the object to fall from the drop using the equation for vertical motion: h = (1/2) * g * t².
Plugging in the known values, we have:
50 = (1/2) * 9.8 * t²
t² = (50 * 2) / 9.8
t² ≈ 10.2
t ≈ √10.2
t ≈ 3.19 seconds.
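The fall-time calculation above can be checked with a short Python sketch (the variable names are illustrative, not from the original problem statement):

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2 (assumed Earth value)
h = 50.0  # height of the drop, m

# From h = (1/2) * g * t^2, solve for the fall time t.
t = math.sqrt(2 * h / g)
print(f"fall time: {t:.2f} s")  # ≈ 3.19 s
```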

3. Now, we can find the horizontal distance traveled by the object during the 3.19 seconds it takes to fall. We'll use the equation: d = Vx * t.
Plugging in the known values, we have:
90 = Vx * 3.19
Vx = 90 / 3.19
Vx ≈ 28.2 m/s.
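Combining both steps, a minimal Python check of the required speed (again assuming g = 9.8 m/s²; carrying full precision for t avoids rounding error in the final answer):

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 50.0  # height of the drop, m
d = 90.0  # required horizontal landing distance, m

t = math.sqrt(2 * h / g)  # fall time from the vertical-motion step
vx = d / t                # required horizontal speed, d = Vx * t
print(f"required speed: {vx:.1f} m/s")  # ≈ 28.2 m/s
```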

Therefore, the object must be traveling at approximately 28.2 m/s horizontally before reaching the drop in order to land 90 meters away from the edge.