A rifle is pointed horizontally at a target 100 meters away. The bullet hits 0.75 millimeters below the aim point. How long did it take to get there? What was the velocity of the bullet when it was first fired?

To find the time it took for the bullet to reach the target and the initial velocity of the bullet, we can use the equations of motion, treating the horizontal and vertical motions separately.

Given:
- Distance traveled horizontally (range), d = 100 meters.
- Vertical displacement (drop), y = 0.75 millimeters = 0.00075 meters.

First, let's consider the horizontal motion:

Neglecting air resistance, the horizontal velocity is constant, since gravity acts only vertically. The time of flight therefore satisfies:
time = distance / velocity.

We know the horizontal distance (100 meters) but not the velocity, so we rearrange this to:
velocity = distance / time,
and find the time from the vertical motion instead.

Next, let's consider the vertical motion:

Vertically, the bullet is in free fall under gravity, and its vertical displacement is given by the equation:
y = (1/2) * g * t^2,
where g is the acceleration due to gravity (approximately 9.8 m/s^2), and t is the time of flight.

We know the vertical displacement is 0.00075 meters, so we can solve for t:
t = sqrt((2 * y) / g).
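
As a quick sketch in Python (assuming g = 9.8 m/s^2, as above; the variable names are illustrative), this step looks like:

```python
import math

g = 9.8      # acceleration due to gravity, m/s^2
y = 0.00075  # vertical drop, m (0.75 mm)

# y = (1/2) * g * t^2  =>  t = sqrt(2y / g)
t = math.sqrt(2 * y / g)
print(f"time of flight: t = {t:.5f} s")  # ≈ 0.01237 s
```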

With the value of t, we can substitute it back into the equation for velocity to find the initial velocity:

velocity = distance / time.

Now, let's plug in the values and calculate the results:

Step 1: Calculate the time of flight (t) from the vertical drop:
t = sqrt((2 * 0.00075) / 9.8) ≈ 0.01237 seconds.

Step 2: Calculate the initial velocity:
velocity = distance / time = 100 / 0.01237 ≈ 8083 m/s.

Therefore, the bullet took approximately 0.01237 seconds to reach the target, and its initial velocity was approximately 8083 m/s.
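
Putting both steps together, a minimal end-to-end check in Python (same assumptions as above) might look like:

```python
import math

g = 9.8      # acceleration due to gravity, m/s^2
d = 100.0    # horizontal distance to the target, m
y = 0.00075  # vertical drop, m (0.75 mm)

# Vertical motion: y = (1/2) * g * t^2  =>  t = sqrt(2y / g)
t = math.sqrt(2 * y / g)

# Horizontal motion: constant velocity, so v = d / t
v = d / t

print(f"time of flight:   {t:.5f} s")    # ≈ 0.01237 s
print(f"initial velocity: {v:.1f} m/s")  # ≈ 8082.9 m/s
```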