A missile is fired with a launch velocity of 15000 ft/s at a target 1200 miles away. How long after it is fired will the target be hit? Use g = 32 ft/s^2. Express your answer in seconds.

http://www.jiskha.com/display.cgi?id=1457748813

To find the time it takes the missile to reach the target, we need to treat this as full projectile motion. The 15000 ft/s launch speed has both a horizontal and a vertical component, and the 1200-mile range is what determines the launch angle. The equations of motion are:

R = (v cos θ)t (horizontal)
0 = (v sin θ)t - 0.5gt^2 (vertical)

Where:
R = horizontal range = 1200 miles = 1200 × 5280 = 6,336,000 ft
v = launch speed (15000 ft/s)
θ = launch angle (unknown)
t = time of flight
g = acceleration due to gravity (32 ft/s^2)

The vertical displacement is 0 because the target sits at the same elevation as the launch point. Solving the vertical equation for its nonzero root gives the time of flight, t = 2v sin θ / g, and substituting that into the horizontal equation gives the standard range formula:

R = v^2 sin(2θ) / g
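
Since the algebra above is easy to fumble, here is a minimal symbolic sketch that double-checks both results. It assumes the sympy library is available; nothing about it is specific to this problem beyond the two equations of motion.

```python
import sympy as sp

v, g = sp.symbols('v g', positive=True)
t, theta = sp.symbols('t theta', positive=True)

# Vertical motion: the missile returns to launch elevation when
# 0 = v*sin(theta)*t - g*t^2/2. Declaring t positive filters out
# the trivial t = 0 root (the instant of launch).
t_flight = sp.solve(sp.Eq(0, v * sp.sin(theta) * t - g * t**2 / 2), t)[0]
print(t_flight)        # 2*v*sin(theta)/g

# Horizontal motion: substitute the flight time into R = v*cos(theta)*t
# and contract the double angle to recover the range formula.
range_formula = sp.trigsimp(v * sp.cos(theta) * t_flight)
print(range_formula)   # v**2*sin(2*theta)/g
```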

Let's solve the range formula for the launch angle (θ):

sin(2θ) = gR / v^2

Substituting the values:

sin(2θ) = (32 × 6,336,000) / 15000^2 = 202,752,000 / 225,000,000 ≈ 0.9011

Taking the inverse sine gives two firing solutions, because sin(2θ) = sin(180° - 2θ):

2θ = 64.3°, so θ ≈ 32.2° (low, flat trajectory)
2θ = 180° - 64.3° = 115.7°, so θ ≈ 57.8° (high, lofted trajectory)

Both angles reach the target; they differ only in how long the missile stays in the air. Using t = 2v sin θ / g:

Low trajectory: t = 2 × 15000 × sin 32.2° / 32 ≈ 499 seconds
High trajectory: t = 2 × 15000 × sin 57.8° / 32 ≈ 794 seconds

Taking the direct (minimum-angle) trajectory, the missile will hit the target approximately 499 seconds after launch (499/60 ≈ 8.3 minutes).
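
As a sanity check, the same numbers can be run through a short numeric sketch (plain Python, standard library only) that computes both flight times and confirms each trajectory actually covers the 1200-mile range:

```python
import math

v = 15000.0          # launch speed, ft/s
g = 32.0             # gravitational acceleration, ft/s^2
R = 1200 * 5280.0    # range: 1200 miles converted to feet

sin_2theta = g * R / v**2          # from R = v^2 * sin(2*theta) / g
two_theta = math.asin(sin_2theta)  # principal solution, in radians

# Both 2*theta and 180 deg - 2*theta satisfy the range equation.
for theta in (two_theta / 2, (math.pi - two_theta) / 2):
    t = 2 * v * math.sin(theta) / g    # time of flight
    x = v * math.cos(theta) * t        # horizontal distance covered
    print(f"theta = {math.degrees(theta):4.1f} deg, "
          f"t = {t:5.1f} s, range = {x / 5280:.0f} miles")
```

Running this prints roughly 499 s for the 32.2° trajectory and 794 s for the 57.8° one, with both range checks coming back as 1200 miles.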