2 votes
A set of data has a high-value outlier. How do you expect the standard deviation to change when the outlier is removed? Would the result be different if the data had a low-value outlier instead? Explain.

2 Answers

1 vote

Answer:

Check each of the components below that you included in your response.

The standard deviation will decrease when the outlier is removed.

Standard deviation represents the spread of data from the mean.

Removing a high-value outlier decreases the spread of data from the mean.

Removing a low-value outlier decreases the spread of data from the mean.

In both cases the standard deviation decreases.
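The checklist above can be verified numerically. A minimal sketch using hypothetical data and Python's standard-library `statistics` module: the population standard deviation is larger whenever either kind of outlier is present, so removing it decreases the spread in both cases.

```python
from statistics import pstdev

data = [10, 12, 11, 13, 12]      # hypothetical data set with no outlier
with_high = data + [40]          # same data plus a high-value outlier
with_low = data + [-20]          # same data plus a low-value outlier

# Spread of the bulk alone is smaller than with either outlier included,
# so removing the outlier decreases the standard deviation either way.
print(pstdev(data))
print(pstdev(with_high))  # larger: the high outlier inflates the spread
print(pstdev(with_low))   # larger: the low outlier also inflates the spread
```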


answered by GatesDA (6.7k points)
4 votes
When you have an outlier that is higher than the bulk of the data, the standard deviation is larger than it would be without the outlier, and the mean goes down when the outlier is removed. If the outlier is lower than the bulk of the data, the standard deviation is again larger than it would be without the outlier. Where the two cases differ is the mean: removing a low-value outlier results in an increase of the mean instead of a decrease.

In short, a high-value or low-value outlier affects the standard deviation the same way: when the outlier is removed, the standard deviation decreases.
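The difference in how the mean moves can also be checked directly, again with hypothetical numbers: a high outlier pulls the mean up (so removing it lowers the mean), while a low outlier pulls the mean down (so removing it raises the mean).

```python
from statistics import mean

data = [10, 12, 11, 13, 12]  # hypothetical data set with no outlier

# A high outlier raises the mean; removing it would lower the mean.
print(mean(data + [40]) > mean(data))   # True

# A low outlier lowers the mean; removing it would raise the mean.
print(mean(data + [-20]) < mean(data))  # True
```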
answered by Krzysztof Boduch (6.7k points)