31 votes
How far from the base of the cliff would the driver have landed if the driver had initially been moving horizontally with a speed of 10 m/s?

by User Lewis Norton (2.5k points)

1 Answer

19 votes

Answer: C

Step-by-step explanation:

y0 = 100 m
y = 0
x0 = 0
x = ?
v0x = 10 m/s (moving horizontally)
v0y = 0
x = x0 + v0x·t

x = 10t

y = y0 + v0y·t + (1/2)a·t²

Here a = -g (the acceleration is gravity, which points downward), and v0y = 0, so setting y = 0 at the ground:

0 = y0 - (1/2)g·t²

Solve for t:

t = √(2y0/g)
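For reference, evaluating the fall time with the given numbers:

```latex
t = \sqrt{\frac{2 y_0}{g}} = \sqrt{\frac{2 \times 100\ \text{m}}{9.80\ \text{m/s}^2}} \approx 4.52\ \text{s}
```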

Plug this into the x equation:

x = 10·√(2y0/g), where y0 = 100 m and g = 9.80 m/s²

x = 45.2 m

As a check: with v0 = 5 m/s the same equation gives x = 22.6 m (half the speed, half the distance), which is consistent. However, if your class uses g = 10.0 m/s², you get x = 22.4 m for the 5 m/s case and x = 44.7 m for the 10 m/s case.
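If you want to verify the arithmetic yourself, here is a small Python sketch (the helper name landing_distance is just illustrative) that reproduces all four numbers:

```python
import math

def landing_distance(v0x, y0=100.0, g=9.80):
    """Horizontal landing distance for a projectile launched
    horizontally from height y0, ignoring air resistance.

    Fall time: 0 = y0 - (1/2)*g*t^2  =>  t = sqrt(2*y0/g)
    Distance:  x = v0x * t
    """
    t = math.sqrt(2 * y0 / g)
    return v0x * t

# With g = 9.80 m/s^2:
print(round(landing_distance(10.0), 1))          # 45.2
print(round(landing_distance(5.0), 1))           # 22.6

# With g = 10.0 m/s^2, as some classes use:
print(round(landing_distance(10.0, g=10.0), 1))  # 44.7
print(round(landing_distance(5.0, g=10.0), 1))   # 22.4
```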

by User Metaforge (3.1k points)