A baseball is hit with a speed of 29.0 m/s at an angle of 44.0°. It lands on the flat roof of a 10.0 m-tall nearby building. If the ball was hit when it was 1.4 m above the ground, what horizontal distance does it travel before it lands on the building?

To find the horizontal distance traveled by the baseball before it lands on the building, we can use the equations of projectile motion.

First, we need to find the time it takes for the ball to reach the roof of the building. We can use the vertical motion of the ball for this.

Given:
Initial vertical position (y₀) = 1.4 m
Final vertical position (y) = height of the building = 10.0 m
Vertical displacement (Δy) = y − y₀ = 10.0 m − 1.4 m = 8.6 m
Acceleration due to gravity (g) = 9.8 m/s² (assuming no air resistance)

Using the equation for vertical displacement (taking upward as positive, so gravity enters with a minus sign):
Δy = v₀y * t - (1/2) * g * t²

Where:
v₀y = initial vertical component of velocity
t = time

We know that the initial vertical component of velocity can be found using the initial velocity and the launch angle:
v₀y = v₀ * sin(θ)

Given:
Initial speed (v₀) = 29.0 m/s
Launch angle (θ) = 44.0°

Substituting the known values (with Δy = 8.6 m), the equation becomes:
8.6 m = (29.0 m/s * sin(44.0°)) * t - (1/2) * (9.8 m/s²) * t²

Let's solve this equation to find the time (t).

Rearranging, we get a quadratic equation in t:
4.9 t² - (29.0 m/s * sin(44.0°)) * t + 8.6 m = 0

Numerically, with 29.0 m/s * sin(44.0°) ≈ 20.1 m/s:
4.9 t² - 20.1 t + 8.6 = 0

Solving this quadratic gives two positive roots: t ≈ 0.48 s (the ball passing roof height on the way up) and t ≈ 3.63 s (the ball on its way down). Since the ball lands on the roof while descending, we take the larger root, t ≈ 3.63 s.
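As a numerical check, here is a short Python sketch (variable names are our own) that solves the vertical-motion quadratic for the two times at which the ball is 8.6 m above its launch height:

```python
import math

v0 = 29.0                    # initial speed, m/s
theta = math.radians(44.0)   # launch angle, converted to radians
g = 9.8                      # gravitational acceleration, m/s^2
dy = 10.0 - 1.4              # vertical displacement to the roof, m

v0y = v0 * math.sin(theta)   # initial vertical velocity component

# Solve (g/2) t^2 - v0y t + dy = 0 with the quadratic formula
a, b, c = 0.5 * g, -v0y, dy
disc = math.sqrt(b * b - 4 * a * c)
t_up = (-b - disc) / (2 * a)    # passing roof height on the way up
t_down = (-b + disc) / (2 * a)  # descending onto the roof

print(round(t_up, 2), round(t_down, 2))  # 0.48 3.63
```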

Once we have the time, we can find the horizontal distance traveled by the ball using the equation:
Δx = v₀x * t

Where:
v₀x = initial horizontal component of velocity

The initial horizontal component of velocity can be found using the initial speed and the launch angle:
v₀x = v₀ * cos(θ)


Numerically, v₀x = 29.0 m/s * cos(44.0°) ≈ 20.9 m/s. With t ≈ 3.63 s from the vertical motion (the ball is descending when it lands), the horizontal distance is:

Δx = (20.9 m/s) * (3.63 s) ≈ 75.7 m

The ball travels about 75.7 m horizontally before landing on the roof.
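Putting both steps together, the full calculation can be sketched in Python (all names are our own; the larger root of the vertical-motion quadratic is written in closed form):

```python
import math

v0, theta_deg, g = 29.0, 44.0, 9.8
dy = 10.0 - 1.4              # roof height above the launch point, m

theta = math.radians(theta_deg)
v0y = v0 * math.sin(theta)   # initial vertical velocity component
v0x = v0 * math.cos(theta)   # initial horizontal velocity component

# Time to land on the roof: larger root of (g/2) t^2 - v0y t + dy = 0
t = (v0y + math.sqrt(v0y**2 - 2 * g * dy)) / g

dx = v0x * t                 # horizontal distance traveled
print(round(dx, 1))          # 75.7
```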