Final answer:
To find the time it takes for a ball thrown vertically upwards to land, we solve the quadratic equation h(t) = 20t - 5t² by setting h to zero. The positive root of the equation gives a landing time of 4 seconds.
Step-by-step explanation:
To determine how long it takes for a ball thrown vertically upward with an initial velocity of 20 meters per second to land, we use the vertical-motion equation h(t) = 20t - 5t². The ball lands when its height h equals 0 meters, so we set the equation equal to 0 and solve for t. This is a quadratic equation, and its roots can be found with the quadratic formula: t = (-b ± √(b² - 4ac)) / (2a), where a, b, and c are the coefficients of the equation at² + bt + c = 0.
In our case, a = -5, b = 20, and c = 0. Substituting these values gives t = (-20 ± √(20² - 4·(-5)·0)) / (2·(-5)) = (-20 ± 20) / (-10), so the ball is at ground level at two times: t = 0 and t = 4 seconds. (Because c = 0, the equation also factors directly as 20t - 5t² = 5t(4 - t) = 0, which yields the same two roots.) The root t = 0 simply corresponds to the instant the ball is thrown, so the landing time is the non-zero root, t = 4 seconds.
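The calculation above can be sketched in a few lines of Python. This is a minimal illustration, assuming the general form h(t) = v0·t - (g/2)·t², where the document's equation corresponds to v0 = 20 m/s and g = 10 m/s² (the function name `landing_time` is chosen here for illustration):

```python
def landing_time(v0: float, g: float = 10.0) -> float:
    """Non-zero root of h(t) = v0*t - (g/2)*t**2 = 0.

    Factoring gives t * (v0 - (g/2)*t) = 0, so besides the
    launch instant t = 0, the ball is at ground level when
    t = 2*v0 / g.
    """
    return 2 * v0 / g

# Ball thrown upward at 20 m/s, with g = 10 m/s^2 as in h(t) = 20t - 5t^2:
print(landing_time(20.0))  # → 4.0
```

The formula t = 2·v0/g also shows why the answer scales linearly with the launch speed: doubling v0 doubles the time aloft.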
Therefore, it takes 4 seconds for the ball to return to the ground.