If a ball is hit with an initial velocity of 110 feet per second at an angle of 45 degrees from an initial height of 2 feet, how far will the ball travel before it hits the ground?

Just solve for t in

h(t) = 2 + (110 sin 45°)t - 16t^2 = 0

then multiply that t by the constant horizontal speed

110 cos 45°
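
If you want to see the numbers worked out, here's a minimal Python sketch of that approach (assuming g = 32 ft/s², which is where the 16t^2 term comes from):

```python
import math

v0 = 110.0                 # initial speed, ft/s
h0 = 2.0                   # initial height, ft
theta = math.radians(45)   # launch angle

vy = v0 * math.sin(theta)  # initial vertical speed (110 sin 45°)
vx = v0 * math.cos(theta)  # constant horizontal speed (110 cos 45°)

# Positive root of 2 + vy*t - 16*t^2 = 0, via the quadratic formula
t = (vy + math.sqrt(vy**2 + 4 * 16 * h0)) / (2 * 16)

print(f"t = {t:.3f} s, distance = {vx * t:.1f} ft")  # t = 4.887 s, 380.1 ft
```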

Alternatively, you can use the standard range formula for projectile motion:

d = (v^2 * sin(2θ)) / g

Where:
- d is the horizontal distance traveled by the ball
- v is the initial velocity of the ball
- θ is the launch angle (45 degrees in this case)
- g is the acceleration due to gravity (32 feet per second squared, consistent with the 16t^2 term above)

Note that this formula assumes the ball lands at the same height it was launched from, so it ignores the 2-foot initial height and will slightly underestimate the true distance.

First, let's calculate the value of v^2 * sin(2θ):

v^2 * sin(2θ) = (110^2) * sin(2 * 45°)
= 12100 * sin(90°)
= 12100 * 1
= 12100

Now, we can substitute this value into the formula and solve for the distance:

d = (v^2 * sin(2θ)) / g
d = 12100 / 32
d ≈ 378.1 feet

That's the flat-ground answer. Accounting for the 2-foot launch height (solving the quadratic in the first reply) stretches the flight slightly, so the ball actually travels about 380.1 feet before it hits the ground.
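
As a quick numeric check, here's a minimal Python sketch of the range formula (again assuming g = 32 ft/s² to match the equation in the first reply):

```python
import math

v0, g = 110.0, 32.0      # initial speed (ft/s) and gravity (ft/s^2)
theta = math.radians(45)

# Flat-ground range formula: ignores the 2 ft launch height
d_flat = v0**2 * math.sin(2 * theta) / g

print(f"{d_flat:.3f} ft")  # 378.125 ft, vs. ~380.1 ft with the launch height
```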