A driver in a car traveling at a speed of 21.8 m/s sees a cat 101 m away on the road. How long will it take the car, decelerating uniformly, to come to a stop in exactly 90 m?

Use Vf^2 = Vi^2 + 2 a d with Vf = 0 and d = 90 m, and solve for a.

Then use Vf = Vi + a t (a comes out negative for braking) and solve for t.

t ≈ 8.26 s
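Worked out with the given numbers (taking a as negative because the car is slowing), those two steps give:

\[a = \frac{0 - (21.8 \, \text{m/s})^2}{2(90 \, \text{m})} \approx -2.64 \, \text{m/s}^2, \qquad t = \frac{0 - 21.8 \, \text{m/s}}{-2.64 \, \text{m/s}^2} \approx 8.26 \, \text{s}\]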

To solve this problem, we can use the equation of motion for uniformly accelerated motion:

\[v_f^2 = v_i^2 + 2ax\]

where:
- \(v_f\) is the final velocity (which is 0 m/s when the car comes to a stop)
- \(v_i\) is the initial velocity (21.8 m/s in this case)
- \(a\) is the acceleration
- \(x\) is the displacement covered during the deceleration (90 m in this case; the 101 m distance to the cat only tells us the car stops before reaching it)

Rearranging the equation, we get:

\[a = \frac{{v_f^2 - v_i^2}}{{2x}}\]

Since \(v_f = 0\) m/s, we can simplify the equation to:

\[a = \frac{{-v_i^2}}{{2x}}\]

Plugging in the given values, we have:

\[a = \frac{-(21.8 \, \text{m/s})^2}{2 \times 90 \, \text{m}} \approx -2.64 \, \text{m/s}^2\]

The negative sign indicates deceleration. Finally, since \(v_f = v_i + at\) and \(v_f = 0\), the stopping time is

\[t = \frac{v_f - v_i}{a} = \frac{0 - 21.8 \, \text{m/s}}{-2.64 \, \text{m/s}^2} \approx 8.26 \, \text{s}\]

So the car takes about 8.26 s to come to a stop.
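If you want to double-check the arithmetic, here is a minimal Python sketch of the same two-step calculation (the variable names are mine, not part of the original problem):

```python
# Uniformly decelerating car: find the stopping time over a fixed distance.
v_i = 21.8   # initial speed, m/s
v_f = 0.0    # final speed, m/s (the car stops)
d = 90.0     # stopping distance, m

# Step 1: v_f^2 = v_i^2 + 2*a*d  ->  a = (v_f^2 - v_i^2) / (2*d)
a = (v_f**2 - v_i**2) / (2 * d)

# Step 2: v_f = v_i + a*t  ->  t = (v_f - v_i) / a
t = (v_f - v_i) / a

print(f"acceleration a = {a:.2f} m/s^2")  # about -2.64 m/s^2
print(f"stopping time t = {t:.2f} s")     # about 8.26 s
```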