A baseball player notices the ball when it is 3.4 m above the ground, traveling at 4.4 m/s. He wants to make the catch when the ball is 1.5 m above the ground. How long does it take to reach his glove?

User Jabirali, 4.2k points

2 Answers

5 votes

Answer:

Step-by-step explanation:

s = s₀ + v₀t + ½at²

There are an infinite number of solutions to this question as posed because we are not told the direction of the initial velocity.

Assuming the ground is level, taking the ground as the origin, and taking up as the positive direction:

The shortest possible time occurs when the initial velocity is straight down:

1.5 = 3.4 - 4.4t + ½(-9.8)t²

0 = -4.9t² - 4.4t + 1.9

t = (4.4 ±√(4.4² - 4(-4.9)(1.9))) / (2(-4.9))

The positive root is

t = 0.32 s
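As a quick numerical check, here is a minimal Python sketch (my own, assuming g = 9.8 m/s²) that runs the quadratic formula for this case:

```python
import math

# Straight-down case: 0 = -4.9 t^2 - 4.4 t + 1.9
a, b, c = -4.9, -4.4, 1.9
disc = math.sqrt(b**2 - 4*a*c)
t1 = (-b + disc) / (2*a)   # ≈ -1.22 s (nonphysical: before the throw)
t2 = (-b - disc) / (2*a)   # ≈  0.32 s (the catch)
print(round(t2, 2))        # 0.32
```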

The longest amount of time possible is when the initial velocity is straight up.

1.5 = 3.4 + 4.4t + ½(-9.8)t²

0 = -4.9t² + 4.4t + 1.9

t = (-4.4 ±√(4.4² - 4(-4.9)(1.9))) / (2(-4.9))

The positive root is

t = 1.22 s
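The same check with the sign of the linear coefficient flipped:

```python
import math

# Straight-up case: 0 = -4.9 t^2 + 4.4 t + 1.9
a, b, c = -4.9, 4.4, 1.9
disc = math.sqrt(b**2 - 4*a*c)
t = (-b - disc) / (2*a)  # the positive root
print(round(t, 2))       # 1.22
```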

If the initial velocity is horizontal, meaning there is no initial vertical velocity:

1.5 = 3.4 + 0t + ½(-9.8)t²

-4.9t² = -1.9

t² = 0.38775...

t = 0.62 s
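No quadratic formula is needed here; a one-line check of the square root suffices:

```python
import math

# Horizontal case: 1.5 = 3.4 - 4.9 t^2  ->  t = sqrt(1.9 / 4.9)
t = math.sqrt(1.9 / 4.9)
print(round(t, 2))  # 0.62
```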

Any launch angle between straight up and straight down has a different initial vertical velocity and results in a different time to reach the catch height.
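To make that concrete, here is a minimal sketch (the helper name and its defaults are my own, assuming the 4.4 m/s speed is launched at an angle θ measured from the horizontal; only the vertical component v·sin θ affects the fall time):

```python
import math

def catch_time(theta_deg, speed=4.4, y0=3.4, y_catch=1.5, g=9.8):
    """Time to go from y0 to y_catch for a launch angle above horizontal.
    Hypothetical helper for illustration, not part of the original answer."""
    vy = speed * math.sin(math.radians(theta_deg))  # initial vertical velocity
    a, b, c = -g / 2, vy, y0 - y_catch              # 0 = a t^2 + b t + c at the catch
    disc = math.sqrt(b**2 - 4 * a * c)
    return (-b - disc) / (2 * a)                    # the positive root (a < 0, c > 0)

for angle in (-90, 0, 90):                          # straight down, horizontal, straight up
    print(angle, round(catch_time(angle), 2))       # -90 -> 0.32, 0 -> 0.62, 90 -> 1.22
```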

It appears from the comments on the other answer that I have shown you how to arrive at three of the four possible solutions. The initial direction is very important.

User Diahann, 4.5k points
3 votes

Find the distance the ball travels:

3.4 meters - 1.5 meters = 1.9 meters

Now divide that distance by the speed:

1.9 meters / 4.4 m/s ≈ 0.43 seconds
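A quick check of that division (note that this approach treats the 4.4 m/s as a constant straight-line speed toward the glove):

```python
t = (3.4 - 1.5) / 4.4  # distance to the glove divided by a constant speed
print(round(t, 2))     # 0.43
```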

User Paul Lemarchand, 4.8k points