Final answer:
The difference in time, rounded to the nearest tenth, between when object 1 and object 2 hit the ground is 1.1 seconds.
Step-by-step explanation:
The two objects, object 1 and object 2, are both released from the top of a 50.0 m building. Object 1 is dropped from rest (initial velocity 0 m/s), while object 2 is thrown straight downward with an initial velocity of 13.0 m/s. In both cases, the only force acting on the objects is gravity, which causes them to accelerate downward at 9.8 m/s².
Since both objects fall the same height under the same acceleration, any difference in fall time comes from their different initial velocities. To find that difference, we calculate the time it takes each object to reach the ground.
Using the kinematic equation h = v₀t + (1/2)gt², where h is the height, v₀ is the initial velocity, g is the acceleration due to gravity, and t is the time, we can solve for t by substituting the values:
For object 1: h = 50.0m and v₀ = 0m/s
With v₀ = 0, the equation reduces to h = (1/2)gt²: 50.0 = (1/2)(9.8)t², t² = 100/9.8 ≈ 10.2, t ≈ 3.2 s
For object 2: h = 50.0m and v₀ = 13.0m/s
h = (1/2)gt² + v₀t, 50.0 = (1/2)(9.8)t² + 13.0t, 100 = 9.8t² + 26t, 9.8t² + 26t - 100 = 0
Using the quadratic formula and taking the positive root: t = [−26 + √(26² + 4(9.8)(100))]/(2·9.8) = (−26 + √4596)/19.6 ≈ 2.1 s
Therefore, the difference in time between when the two objects hit the ground is approximately 3.2 s − 2.1 s = 1.1 s.
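The calculation above can be checked with a short Python sketch (the variable names are ours, not from the problem statement): object 1's time comes from t = √(2h/g), and object 2's from the positive root of (g/2)t² + v₀t − h = 0.

```python
import math

g = 9.8    # acceleration due to gravity (m/s^2)
h = 50.0   # building height (m)
v0 = 13.0  # object 2's initial downward speed (m/s)

# Object 1: dropped from rest, h = (1/2) g t^2  =>  t = sqrt(2h/g)
t1 = math.sqrt(2 * h / g)

# Object 2: h = v0 t + (1/2) g t^2  =>  (g/2) t^2 + v0 t - h = 0,
# solved with the quadratic formula (positive root)
a, b, c = g / 2, v0, -h
t2 = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

print(round(t1, 1), round(t2, 1), round(t1 - t2, 1))  # 3.2 2.1 1.1
```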