Two identical items, object 1 and object 2, are dropped from the top of a 50.0 m building. Object 1 is dropped with an initial velocity of 0 m/s, while object 2 is thrown straight downward with an initial velocity of 13.0 m/s. What is the difference in time, in seconds rounded to the nearest tenth, between when the two objects hit the ground?
a. Object 1 will hit the ground 3.2 s after object 2.
b. Object 1 will hit the ground 2.1 s after object 2.
c. Object 1 will hit the ground at the same time as object 2.
d. Object 1 will hit the ground 1.1 s after object 2.

1 Answer


Final answer:

Object 1 will hit the ground 1.1 seconds after object 2. To determine the difference in time between when the two objects hit the ground, we need to find the time it takes for each object to reach the ground.

Step-by-step explanation:

For object 1, since it is dropped from rest, we can use the equation h = 0.5gt^2, where h is the initial height of 50.0 m and g is the acceleration due to gravity (9.8 m/s^2).

Rearranging the equation, we get t = sqrt(2h/g).

Plugging in the values, t = sqrt(2*50.0 m / 9.8 m/s^2) = 3.19 s (rounded to the nearest hundredth).
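A quick numerical check of this step (a Python sketch, assuming g = 9.8 m/s^2):

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2 (assumed value)
h = 50.0  # building height, m

# Object 1 is dropped from rest: h = 0.5*g*t^2  =>  t = sqrt(2h/g)
t1 = math.sqrt(2 * h / g)
print(round(t1, 2))  # 3.19
```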

For object 2, it is thrown straight downward with an initial velocity of 13.0 m/s.

Taking downward as positive, we can use the equation h = 13.0t + 0.5gt^2.

Rearranging into standard quadratic form gives 4.9t^2 + 13.0t - 50.0 = 0, so the positive root is t = (-13.0 + sqrt(13.0^2 + 4*4.9*50.0))/(2*4.9).

Plugging in the values, t = 2.13 s (rounded to the nearest hundredth).
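This quadratic step, and the final difference, can also be checked numerically (a Python sketch, same assumed g = 9.8 m/s^2):

```python
import math

g = 9.8    # acceleration due to gravity, m/s^2 (assumed value)
h = 50.0   # building height, m
v0 = 13.0  # initial downward speed of object 2, m/s

# Downward-positive: h = v0*t + 0.5*g*t^2  =>  0.5*g*t^2 + v0*t - h = 0
a, b, c = 0.5 * g, v0, -h
t2 = (-b + math.sqrt(b**2 - 4 * a * c)) / (2 * a)  # positive root

t1 = math.sqrt(2 * h / g)  # object 1, dropped from rest

print(round(t2, 2))       # 2.13
print(round(t1 - t2, 1))  # 1.1
```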

The difference in time between when the two objects hit the ground is 3.19 s - 2.13 s = 1.06 s, which rounds to 1.1 s.

Therefore, option d is correct: Object 1 will hit the ground 1.1 s after object 2.

Answered by Tchypp