A truck is travelling along a straight road at a constant velocity of 50 m s^-1 when the driver sees a dog 200 metres away on the road ahead. In an attempt to avoid an accident, the driver has to brake to a sudden stop.

Once the brakes are applied, the truck has an acceleration of -9.0 m s^-2. How far does the truck travel in coming to a stop after the driver applies the brakes?

v = 50-9t, so it takes 5 seconds to stop

s = 50t - 4.5t^2
So, plug in t=5 and discover the fate of the dog, who has stupidly stood there for 5 seconds in front of an oncoming truck...

Oops. t = 50/9 ≈ 5.56 seconds

But I'm sure you caught that ...
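
As a quick sanity check, here is a minimal numerical sketch of that approach (Python and the variable names are my own additions, not part of the original problem):

```python
# Braking truck: constant-acceleration kinematics.
# Assumed givens from the problem: u = 50 m/s, a = -9.0 m/s^2.

u = 50.0   # initial velocity (m/s)
a = -9.0   # acceleration while braking (m/s^2)

# v = u + a*t, so the truck stops when t = -u/a
t_stop = -u / a
# s = u*t + (1/2)*a*t^2, evaluated at t_stop
s_stop = u * t_stop + 0.5 * a * t_stop**2

print(f"time to stop:      {t_stop:.2f} s")   # ~5.56 s
print(f"stopping distance: {s_stop:.2f} m")   # ~138.89 m
```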

What total distance does the truck travel from the moment the driver notices the danger until it comes to a stop? Does the truck hit the dog?

Since the acceleration is constant, you can use the average speed:

(50 + 0)/2 = 25 m/s for the 5.56 seconds
25 * 5.56 ≈ 139 meters
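
The same average-speed shortcut written as a tiny check (again just an illustrative Python sketch, not part of the original thread):

```python
u, v, a = 50.0, 0.0, -9.0     # initial speed, final speed (m/s), acceleration (m/s^2)

t_stop = (v - u) / a          # 50/9 ≈ 5.56 s
d = 0.5 * (u + v) * t_stop    # average speed × time

print(f"{d:.1f} m")           # ≈ 138.9 m
```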

V^2 = Vo^2 + 2a*d, and V = 0 when the truck stops:

50^2 + (-18)*d = 0
d = 2500/18 ≈ 139 m

To find the distance the truck travels when coming to a stop after the driver applies the brakes, we can use the equations of motion. The equation that relates distance (d), initial velocity (u), final velocity (v), and acceleration (a) is:

v^2 = u^2 + 2ad

In this case, the truck initially has a velocity (u) of 50 m/s, the final velocity (v) is 0 m/s since it comes to a stop, and the acceleration (a) is -9.0 m/s^2 (negative since it's in the opposite direction to the motion).

Plugging in the values into the equation, we get:

0^2 = 50^2 + 2(-9.0) * d

0 = 2500 - 18d

18d = 2500

d = 2500 / 18

d ≈ 138.89 meters

Therefore, the truck travels approximately 138.89 meters when coming to a stop after the driver applies the brakes. Since this is less than the 200 metres to the dog, the truck stops short of the dog (assuming the 200 m is measured from where braking begins, i.e. ignoring any reaction distance).
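
A short check of this result against the dog's position (a Python sketch; the 200 m separation is taken from the problem statement, and any reaction distance is assumed to be zero):

```python
u, v, a = 50.0, 0.0, -9.0   # initial speed, final speed (m/s), acceleration (m/s^2)
dog_distance = 200.0        # metres to the dog, from the problem statement

# v^2 = u^2 + 2*a*d  =>  d = (v^2 - u^2) / (2*a)
d = (v**2 - u**2) / (2 * a)

print(f"stopping distance: {d:.2f} m")                    # ≈ 138.89 m
print("truck stops short of the dog:", d < dog_distance)  # True (ignoring reaction time)
```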