The driver of a moving car sees a deer on the road and hits the brakes, which provide a constant acceleration of -7.0 m/s². If the car's initial velocity was 25 m/s, how far will the car travel before coming to a stop?

1 Answer


Under constant acceleration, we have

v^2 - {v_0}^2 = 2ax

where v is the final velocity, v_0 is the initial velocity, a is the acceleration, and x is the distance traveled.

Solving for x gives

x = \frac{v^2 - {v_0}^2}{2a}

Then, since the car starts with speed 25 m/s, accelerates at -7.0 m/s², and comes to rest (so that v = 0), we have

x = \frac{-\left(25\,\mathrm{m/s}\right)^2}{2\left(-7.0\,\mathrm{m/s^2}\right)} = \frac{625}{14}\,\mathrm{m} \approx \boxed{45\,\mathrm{m}}
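As a quick sanity check, the same calculation can be sketched in Python. The function name `stopping_distance` is just an illustrative label introduced here, not anything from the original problem:

```python
def stopping_distance(v0, a, v=0.0):
    """Distance traveled while velocity changes from v0 to v
    under constant acceleration a, from v^2 - v0^2 = 2*a*x
    rearranged as x = (v^2 - v0^2) / (2*a). SI units assumed."""
    return (v**2 - v0**2) / (2 * a)

# Car braking from 25 m/s with a = -7.0 m/s^2, ending at rest (v = 0):
x = stopping_distance(v0=25.0, a=-7.0)
print(round(x, 1))  # 44.6, which rounds to about 45 m
```

Note that a must be negative here (a deceleration); a positive a with v < v0 would return a negative, unphysical distance.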

Answered by user Kirti Zare.