Suppose an astronaut can jump vertically with an initial velocity of 5 m/s. The time it takes him to return to the ground is given by the equation $0 = 5t - 0.5at^2$, where the time $t$ is in seconds and the acceleration due to gravity $a$ is in m/s². How long will it take him to reach the ground if he jumps on Earth, where $a = 9.8$ m/s²?


1 Answer


Answer:

It will take the astronaut 1.02 seconds to reach the ground.

Explanation:

If $a = 9.8$ m/s², then our equation becomes


$5t - 0.5(9.8)t^2 = 0$


$5t - 4.9t^2 = 0$

Add $4.9t^2$ to both sides to get


$5t = 4.9t^2,$

and dividing both sides by $t$ (discarding the trivial root $t = 0$, which is just the instant of takeoff), we get:


$5 = 4.9t$


$t = \frac{5}{4.9}$


$\boxed{t \approx 1.02\ \text{s}}$

Therefore, it takes the astronaut 1.02 seconds to reach the ground.
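As a quick numerical check (not part of the original answer), here is a minimal Python sketch that finds the roots of $5t - 4.9t^2 = 0$ with NumPy; the variable names `v0` and `a` are just illustrative.

```python
import numpy as np

v0 = 5.0   # initial vertical speed, m/s
a = 9.8    # acceleration due to gravity on Earth, m/s^2

# 0 = v0*t - 0.5*a*t^2, written as a polynomial with coefficients
# in descending order of degree: (-0.5*a)*t^2 + v0*t + 0
roots = np.roots([-0.5 * a, v0, 0.0])

# One root is t = 0 (the instant of takeoff); the larger root is the landing time.
landing_time = max(roots.real)
print(f"Landing time: {landing_time:.2f} s")  # ~1.02 s
```

Running it prints a landing time of about 1.02 s, matching the algebraic result above.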
