If a bomber travels horizontally at a speed of 250 km/hr and at a height of 5000 m, how far away (measured horizontally) from the target must a bomb be released?

how long does it take for the bomb to fall 5000 m?

h_f = h_i - 4.9t^2

t = sqrt(5000/4.9) seconds

how far away? speed*time

change km/hr to m/s

To find the horizontal distance from the target at which the bomb must be released, we treat the horizontal and vertical motions independently: the bomb keeps the bomber's horizontal speed while it falls freely under gravity. First, we need to find the time it takes for the bomb to fall 5000 m.

The horizontal distance traveled by the bomb is the product of the horizontal speed (250 km/hr) and the time it takes to reach the target. Let's convert the speed to m/s for consistency:

250 km/hr = (250 * 1000) m / (60 * 60) sec ≈ 69.44 m/s
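
As a quick check of that conversion, here is a one-line Python sketch (the variable names are just illustrative):

speed_kmh = 250.0
speed_ms = speed_kmh * 1000.0 / 3600.0   # 1 km = 1000 m, 1 hr = 3600 s
print(round(speed_ms, 2))                # 69.44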

Now, the time it takes for the bomb to reach the ground depends only on its vertical motion. Because the bomber flies level, the bomb is released with zero vertical velocity and then falls freely under gravity, so its height obeys h_f = h_i - 4.9t^2 (with g ≈ 9.8 m/s^2). Setting the drop equal to 5000 m gives:

Time = sqrt(2 * height / g)
Time = sqrt(2 * 5000 m / 9.8 m/s^2) = sqrt(5000/4.9) ≈ 31.94 s

During the fall, the bomb keeps the bomber's horizontal speed (air resistance is neglected), so the horizontal distance it covers before impact is:

Distance = Speed * Time
Distance ≈ 69.44 m/s * 31.94 s ≈ 2,218 m

Therefore, the bomb must be released about 2.2 km (roughly 2,200 m) horizontally before the target.
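
Putting the steps together, here is a minimal Python sketch of the same calculation (the variable names and the rounding are my own; g is taken as 9.8 m/s^2 to match the 4.9 in the fall equation):

import math

g = 9.8             # gravitational acceleration, m/s^2
height = 5000.0     # release altitude, m
speed_kmh = 250.0   # bomber's horizontal speed, km/hr

speed_ms = speed_kmh * 1000.0 / 3600.0     # ≈ 69.44 m/s
fall_time = math.sqrt(2.0 * height / g)    # h = (1/2) g t^2  ->  t ≈ 31.94 s
distance = speed_ms * fall_time            # horizontal range ≈ 2218 m

print(f"fall time: {fall_time:.2f} s")
print(f"release distance: {distance:.0f} m")

Running this gives a fall time of about 31.9 s and a release distance of roughly 2.2 km, matching the hand calculation above.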