physics
posted by Taylor.
A bullet is fired from a rifle that is held 4.6 m above the ground in a horizontal position. The initial speed of the bullet is 1130 m/s. There is no error in the velocity value.
(a) Find the time it takes for the bullet to strike the ground.
(b) Find the horizontal distance traveled by the bullet.
(c) If the error in time measurements is Δt, what will be the equation to calculate error in the horizontal distance (Δx)? (Assume x is in meters and t is in seconds. Do not enter units in your expression. Substitute numeric values; the only variable you should enter is Δt.)

Plug the values you have into the formula y = y[initial] + (1/2)gt^2. (There's no v·t term in the vertical equation because the rifle is horizontal, so the initial vertical velocity is zero.)
Remember that (1/2)g is a negative value here, since g points downward!
y[initial] is the height the bullet starts from, i.e., the height at which the gun is held (4.6 m), and g is the standard gravitational acceleration. The final y is just zero, since the bullet ends up at the ground.
Next, solve! Subtract y[initial] from both sides, then divide by (1/2)g. This gives you t^2. Take the square root and voilà!
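The steps above can be checked numerically (a quick sketch, assuming g = 9.8 m/s² for the magnitude of gravitational acceleration):

```python
import math

y0 = 4.6   # initial height of the rifle above the ground, m
g = 9.8    # magnitude of gravitational acceleration, m/s^2

# 0 = y0 - (1/2) g t^2  =>  t = sqrt(2 * y0 / g)
t = math.sqrt(2 * y0 / g)
print(round(t, 2))  # about 0.97 s
```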
For the second part of the question, the horizontal motion has no acceleration (gravity acts only vertically), so use x = vt with the time from part (a).
Don't include a (1/2)at^2 term here; horizontally, a = 0.
For the last one: since x = vt and the speed v = 1130 m/s has no error, the error in x scales linearly with the error in t, so Δx = 1130·Δt.
Hope it helps! 
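Parts (b) and (c) can be sketched the same way (again assuming g = 9.8 m/s²; since v is exact, the error in x comes only from the error in t):

```python
import math

v = 1130.0  # horizontal muzzle speed, m/s (stated as exact)
y0 = 4.6    # initial height, m
g = 9.8     # magnitude of gravitational acceleration, m/s^2

t = math.sqrt(2 * y0 / g)  # fall time from part (a)
x = v * t                  # horizontal range: no horizontal acceleration
print(round(x))            # about 1095 m

# Part (c): x = v * t with v exact, so the error propagates linearly:
# Δx = v * Δt = 1130 * Δt
def dx(dt):
    return v * dt
```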
Careful: there's no horizontal acceleration, so the (1/2)at^2 term drops out and the horizontal distance is just x = vt.

This is right, just plug the numbers in