Describe how the variability of the distribution changes as the sample size increases.

asked by User QWERTYL (3.3k points)

1 Answer

12 votes

Answer:

As the sample size increases, the variability decreases.

Explanation:

Variability is quantified by the deviations of the observations from the mean: the smaller the deviations, the lower the variance.

By the central limit theorem, the sample mean of random samples of size n is approximately normally distributed.

The standard deviation of the sample mean X̄ (the standard error) is

s/√n,

where s is the sample standard deviation (the square root of the sample variance).

The standard error is therefore inversely proportional to the square root of the sample size: as n grows, the standard error shrinks, and so the variability of the distribution of the sample mean decreases.
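The 1/√n behavior can be checked with a small simulation (an illustrative sketch, not part of the original answer; the choice of a Uniform(0, 1) population and the helper `sample_mean_sd` are assumptions made here for demonstration):

```python
# Sketch: estimate the standard deviation of the sample mean for several
# sample sizes n, and compare it with the theoretical value sd/sqrt(n).
import math
import random

random.seed(0)

def sample_mean_sd(n, trials=2000):
    """Empirical SD of the mean of n draws from Uniform(0, 1)."""
    means = [sum(random.random() for _ in range(n)) / n for _ in range(trials)]
    mu = sum(means) / trials
    return math.sqrt(sum((m - mu) ** 2 for m in means) / trials)

pop_sd = math.sqrt(1 / 12)  # SD of the Uniform(0, 1) population
for n in (4, 16, 64):
    print(n, round(sample_mean_sd(n), 4), round(pop_sd / math.sqrt(n), 4))
```

Each printed row shows the simulated spread of X̄ shrinking as n grows, tracking the theoretical value pop_sd/√n.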

answered by User Mitrakov Artem (2.8k points)