Syyedah throws a baseball with an initial velocity of 30 meters per second from a roof at an initial height of 75 meters. Write an equation to model this scenario. How long does it take the ball to reach the ground? Use algebra to find your answer. (5 pts)

1 Answer

Initial velocity (taken as thrown straight down): v₀ = 30 m/s
Initial height: h = 75 m
Taking downward as positive, the distance fallen after t seconds is
d = v₀ · t + g · t² / 2
The ball reaches the ground when d = 75 m:
75 = 30 · t + 9.8 · t² / 2
75 = 30 t + 4.9 t²
An equation that models the height above the ground is y = 75 − 30 t − 4.9 t²; the ball lands when y = 0, that is,
4.9 t² + 30 t − 75 = 0

t = (−30 ± √(30² + 4 · 4.9 · 75)) / (2 · 4.9) = (−30 ± √(900 + 1470)) / 9.8
Taking the positive root: t = (−30 + 48.68) / 9.8 ≈ 1.9 s
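
As a quick numeric check (a minimal sketch of my own, not part of the original answer; the variable names are illustrative), the positive root of 4.9 t² + 30 t − 75 = 0 can be computed with the quadratic formula in Python:

import math

# coefficients of 4.9 t^2 + 30 t - 75 = 0 (downward taken as positive)
a, b, c = 4.9, 30.0, -75.0

disc = b**2 - 4*a*c                        # 900 + 1470 = 2370
t_ground = (-b + math.sqrt(disc)) / (2*a)  # keep only the positive root
print(round(t_ground, 2))                  # prints 1.91, i.e. about 1.9 s

This matches the algebraic answer above of roughly 1.9 seconds.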


answered by Crftr (7.3k points)
