A ball is thrown upward in the air with an initial velocity of 40 m/s. How long does it take to return to the point it was thrown from?


1 Answer


Answer:

You need the definition of acceleration, a = (Vf - Vi)/t, and one equation of linear motion, deltaX = Vi×t + (1/2)×a×t^2. Since the acceleration is constant (gravity, a = -9.81 m/s^2), the initial velocity Vi is 40 m/s, and the velocity Vf at maximum height is zero, you can use the definition of acceleration to find the time to reach the peak.

-9.81 m/s^2 = (0 - 40 m/s)/t

t = (-40 m/s)/(-9.81 m/s^2)

t = 4.077 s
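
As a quick sanity check, here is a minimal Python sketch of that first step (the variable names are my own, chosen for this example):

```python
# Time to reach the peak, from a = (Vf - Vi)/t solved for t.
g = -9.81   # acceleration due to gravity, m/s^2
vi = 40.0   # initial (upward) velocity, m/s
vf = 0.0    # velocity at the peak, m/s

t_up = (vf - vi) / g   # = (-40)/(-9.81) s
print(f"time to peak: {t_up:.3f} s")   # ~4.077 s
```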

Now that you have the time, deltaX is the only unknown left in the equation of linear motion, so you can also find the maximum height.

dX = (40 m/s)(4.077 s) + (1/2)(-9.81 m/s^2)(4.077 s)^2

dX = 163.099 m - 81.549 m

dX = 81.55 m

That is the maximum height. The question, though, asks how long the ball takes to come back to the point it was thrown from: by symmetry, the fall takes just as long as the rise, so the total time is 2 × 4.077 s ≈ 8.15 s.
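
The same idea for the second step and the round trip, again just a sketch continuing from the snippet above:

```python
t_up = 4.077   # time to peak from the step above, s
g = -9.81      # acceleration due to gravity, m/s^2
vi = 40.0      # initial (upward) velocity, m/s

# Maximum height from deltaX = Vi*t + (1/2)*a*t^2
dx = vi * t_up + 0.5 * g * t_up**2
print(f"max height: {dx:.2f} m")   # ~81.55 m

# By symmetry, the fall takes as long as the rise, so the
# total time back to the launch point is twice t_up.
print(f"total flight time: {2 * t_up:.2f} s")   # ~8.15 s
```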
