Next time, please post in the physics section. It is not exactly a math problem because it requires knowledge of kinematics equations, assumptions and constants.
Given:
A rocket is fired (assumed vertically upwards) with an initial velocity of +123 ft/s (upwards taken as positive), subject to an acceleration due to gravity of -32.2 ft/s^2, from an initial height of +4 ft above the ground. We need to find the time it takes for the rocket to return to the ground, assuming air resistance is negligible.
Kinematics equations:
Using the given numerical information:
u = initial velocity = +123 ft/s
g = acceleration due to gravity = -32.2 ft/s^2
s0 = initial height = +4 ft
s1 = final height = 0 ft (ground)
We make use of the kinematics equation
s1 = s0 + u*t + (1/2)*a*t^2.............................(1)
where a = acceleration = g = -32.2 ft/s^2
Substitute the values into (1):
0 = 4 + 123*t - 16.1*t^2
and solve for t using the quadratic formula (knowing that t > 0):
t = [-123 ± sqrt(123^2 - 4*(-16.1)*4)] / (2*(-16.1)) =>
t = -0.032 s or t = 7.672 s
Reject the negative root, so the rocket will hit the ground about 7.67 seconds after launch.
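As a quick check of the arithmetic, here is a minimal Python sketch (the language and variable names are my own choice, not part of the original problem) that rearranges equation (1) into standard quadratic form and keeps the positive root:

import math

# Given values (feet and seconds), mirroring the numbers above
u = 123.0    # initial velocity, ft/s (upwards positive)
g = -32.2    # acceleration due to gravity, ft/s^2
s0 = 4.0     # initial height, ft
s1 = 0.0     # final height, ft (ground level)

# Equation (1): s1 = s0 + u*t + (1/2)*g*t^2, rearranged to
# (0.5*g)*t^2 + u*t + (s0 - s1) = 0, then solved with the quadratic formula
a = 0.5 * g
b = u
c = s0 - s1

disc = b**2 - 4*a*c
roots = [(-b + math.sqrt(disc)) / (2*a), (-b - math.sqrt(disc)) / (2*a)]

# Keep only the physically meaningful (positive) root
t_hit = max(r for r in roots if r > 0)
print(f"time to hit the ground: {t_hit:.3f} s")   # prints about 7.672 s

Running it reproduces the positive root above, which is a handy sanity check whenever the signs of g, s0 and s1 are easy to mix up.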
Note: since #61 and #62 appear faint (pale in colour), I will assume that you can handle those two, or that they will be the subject of separate questions. Thank you.