A baseball is thrown upward at 20 m/s. At what time is the ball 10 m above the point at which it was released?

1 Answer

Answer:

0.583 seconds

Step-by-step explanation:

We are dealing with free-fall motion, which has a constant acceleration of -g (where g is the acceleration due to gravity), so we can use a constant-acceleration kinematic equation to solve this problem.

List the known variables and the variable we want to solve for:

  • v_0 = 20 m/s
  • a = -9.8 m/s²
  • Δx = 10 m
  • t = ?

Find the kinematic equation that contains all four of these variables:

  • Δx = v_0t + (1/2)at²

Plug known values into the equation.

  • 10 = 20t + (1/2)(-9.8)t²

Multiply 1/2 by -9.8.

  • 10 = 20t - 4.9t²

Move all the terms to one side so the equation equals 0, arranging them in descending order of degree.

  • 0 = -4.9t² + 20t - 10

Solve this equation using the quadratic formula (a quick numerical check follows below). You should get:

  • t = 3.498 s, t = 0.583 s
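
As that check, here is a minimal Python sketch of the quadratic formula applied to this equation, assuming SI units and g = 9.8 m/s² (the variable names are our own):

    import math

    # Coefficients of 0 = -4.9t^2 + 20t - 10 from the work above
    a = -4.9   # (1/2) * (-9.8), from gravity
    b = 20.0   # initial upward speed, in m/s
    c = -10.0  # negative of the 10 m displacement

    disc = b**2 - 4 * a * c             # discriminant: 400 - 196 = 204
    t1 = (-b + math.sqrt(disc)) / (2 * a)
    t2 = (-b - math.sqrt(disc)) / (2 * a)

    print(t1, t2)   # ≈ 0.583 s and ≈ 3.498 s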

Let's look at these numbers within the context of the problem. Ignoring gravity for a rough estimate, a ball moving at 20 m/s would cover 10 m in about 0.5 seconds, so the first crossing of the 10 m mark should come a little after 0.5 seconds (gravity slows the ball slightly on the way up).

Both roots are physically meaningful: the ball passes 10 m at t = 0.583 s on the way up and again at t = 3.498 s on the way back down. Since we want the time at which the ball first reaches 10 m above the release point, the answer is 0.583 seconds.
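
To double-check, we can plug both roots back into Δx = v_0t + (1/2)at² and confirm that each gives a displacement of about 10 m (a minimal Python sketch; the helper name height is our own):

    def height(t, v0=20.0, a=-9.8):
        """Displacement above the release point at time t, in meters."""
        return v0 * t + 0.5 * a * t**2

    print(height(0.583))  # ≈ 10.0 m (on the way up)
    print(height(3.498))  # ≈ 10.0 m (on the way back down)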
