A golf ball is hit with an initial velocity of 50 m/s at an angle of 45 degrees with the horizontal. How far will it travel horizontally before it hits the ground?

For initial velocity V and launch angle A, the horizontal distance traveled is

X = 2 sin A cos A (V^2/g)
  = (V^2/g) sin 2A

In this case, 2A = 90°, so sin 2A = 1 and the range reduces to X = V^2/g.
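
As a quick numerical check, here is a minimal Python sketch of this closed-form range formula; the function name projectile_range and its parameter names are illustrative (not from the original solution), and it assumes launch and landing at the same height:

```python
import math

def projectile_range(v, angle_deg, g=9.8):
    """Range X = (v^2 / g) * sin(2A), assuming launch and landing at the same height."""
    return (v ** 2 / g) * math.sin(math.radians(2 * angle_deg))

print(projectile_range(50, 45))  # ~255.1 m, since sin(90 deg) = 1
```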

To solve this problem, we need to break down the initial velocity into its horizontal and vertical components.

The horizontal component of the initial velocity is given by: Vx = V * cos(θ), where V is the magnitude of the initial velocity and θ is the angle with the horizontal.
So, Vx = 50 m/s * cos(45°) = 50 m/s * √2 / 2 = 25√2 m/s ≈ 35.36 m/s.

The vertical component of the initial velocity is given by: Vy = V * sin(θ), where V is the magnitude of the initial velocity and θ is the angle with the horizontal.
So, Vy = 50 m/s * sin(45°) = 50 m/s * √2 / 2 = 25√2 m/s ≈ 35.36 m/s.
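
For illustration, the same decomposition in a short Python sketch (the variable names v, theta, vx, and vy are mine, chosen for this example):

```python
import math

v = 50.0                  # initial speed, m/s
theta = math.radians(45)  # launch angle in radians

vx = v * math.cos(theta)  # horizontal component, 25*sqrt(2) ≈ 35.36 m/s
vy = v * math.sin(theta)  # vertical component,   25*sqrt(2) ≈ 35.36 m/s
print(vx, vy)
```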

Now, we need to determine the time it takes for the ball to return to the ground. Since the ball is launched from ground level, it lands when its vertical displacement is zero: 0 = Vy * t - (1/2) * g * t^2, which gives t = 2 * Vy / g, where g is the acceleration due to gravity (approximately 9.8 m/s^2).
So, t = 2 * (25√2 m/s) / (9.8 m/s^2).

Evaluating this, we get: t = (2 * 25√2 m/s) / (9.8 m/s^2) ≈ 7.22 seconds.
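
A self-contained sketch of this time-of-flight step, again with illustrative names and assuming launch from ground level:

```python
import math

g = 9.8                                  # acceleration due to gravity, m/s^2
vy = 50.0 * math.sin(math.radians(45))   # vertical component, ~35.36 m/s
t_flight = 2 * vy / g                    # from 0 = vy*t - 0.5*g*t^2  =>  t = 2*vy/g
print(t_flight)                          # ~7.22 s
```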

Finally, to find the horizontal distance traveled by the golf ball, we can use the formula: Dx = Vx * t, where Vx is the horizontal component of the initial velocity and t is the time of flight.
So, Dx = 25√2 m/s * 7.22 s ≈ 255 meters.
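
Putting the pieces together in one runnable sketch (same illustrative names as above):

```python
import math

v, g = 50.0, 9.8
vx = v * math.cos(math.radians(45))      # ~35.36 m/s
vy = v * math.sin(math.radians(45))      # ~35.36 m/s
t_flight = 2 * vy / g                    # ~7.22 s
print(vx * t_flight)                     # ~255.1 m, matching (v^2/g)*sin(2A)
```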

Therefore, the golf ball will travel approximately 255 meters horizontally before it hits the ground, which agrees with the closed-form result X = V^2/g = (50 m/s)^2 / (9.8 m/s^2) ≈ 255 m.