Final answer:
Object 1 will hit the ground about 1.1 seconds after object 2. To find the difference, we compute the time each object takes to fall the 50.0 m to the ground.
Step-by-step explanation:
For object 1, since it is dropped from rest, we can use the equation h = 0.5gt^2, where h = 50.0 m is the initial height and g = 9.8 m/s^2 is the acceleration due to gravity.
Rearranging the equation, we get t = sqrt(2h/g).
Plugging in the values,
t = sqrt(2 * 50.0 m / 9.8 m/s^2)
= 3.19 s (rounded to the nearest hundredth).
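As a quick numerical check of this step (a minimal Python sketch, not part of the original solution):

```python
import math

g = 9.8   # acceleration due to gravity, m/s^2
h = 50.0  # initial height, m

# Object 1 is dropped from rest: h = 0.5*g*t^2  ->  t = sqrt(2h/g)
t1 = math.sqrt(2 * h / g)
print(round(t1, 2))  # 3.19
```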
For object 2, it is thrown straight downward with an initial velocity of 13.0m/s.
Taking downward as positive, we can use the equation h = 13.0t + 0.5gt^2, which rearranges to the standard quadratic form 4.9t^2 + 13.0t - 50.0 = 0.
Solving with the quadratic formula and keeping the positive root, t = (-13.0 + sqrt(13.0^2 + 4 * 4.9 * 50.0)) / (2 * 4.9).
Plugging in the values, t = 2.13 s (rounded to the nearest hundredth).
The difference in time between when the two objects hit the ground is 3.19 s - 2.13 s = 1.06 s, or about 1.1 s.
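The whole calculation can be verified numerically with a short Python sketch (variable names are my own, chosen for illustration):

```python
import math

g = 9.8    # acceleration due to gravity, m/s^2
h = 50.0   # initial height, m
v0 = 13.0  # initial downward speed of object 2, m/s

# Object 1, dropped from rest: t = sqrt(2h/g)
t1 = math.sqrt(2 * h / g)

# Object 2: h = v0*t + 0.5*g*t^2, positive root of the quadratic
t2 = (-v0 + math.sqrt(v0**2 + 2 * g * h)) / g

print(round(t1, 2), round(t2, 2), round(t1 - t2, 2))  # 3.19 2.13 1.06
```

The difference of about 1.06 s rounds to the 1.1 s quoted in the answer.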
Therefore, option d is correct: object 1 will hit the ground about 1.1 s after object 2.