118k views
3 votes
How do you use standard deviation?

by Bill Gary (6.6k points)

2 Answers

3 votes
1. Work out the mean (the simple average of the numbers).
2. Then for each number: subtract the mean and square the result.
3. Then work out the mean of those squared differences.
4. Take the square root of that and we are done!
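
Here is a minimal Python sketch of those four steps (the data list is just a made-up example). Dividing by the number of values like this gives the population standard deviation:

    import math

    def standard_deviation(values):
        # 1. Work out the mean (the simple average).
        mean = sum(values) / len(values)
        # 2. For each number, subtract the mean and square the result.
        squared_diffs = [(x - mean) ** 2 for x in values]
        # 3. Work out the mean of those squared differences (the variance).
        variance = sum(squared_diffs) / len(values)
        # 4. Take the square root of the variance.
        return math.sqrt(variance)

    # Made-up example data: the mean is 5, and the result comes out to exactly 2.
    print(standard_deviation([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0
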
by Egos Zhang (6.9k points)
4 votes

Standard deviation is roughly the typical distance the values sit from the mean, so it is a way of measuring spread. If you remember box and whisker plots, those show a similar idea: the line inside the box is the median, the edges of the box are the quartiles, and the whiskers stretch out to the rest of the data, with values beyond them treated as outliers. That picture is built from quartiles rather than the standard deviation, but both describe how spread out the values are. A way you can think about it: say I had 3 numbers: 7, 8, 9. The mean is 8, and 7 and 9 each sit 1 away from it, so the (sample) standard deviation works out to exactly 1. I hope this helped.
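
To check the 7, 8, 9 example, here is a small sketch using Python's built-in statistics module; whether you get exactly 1 depends on whether you divide by n or by n - 1:

    import statistics

    data = [7, 8, 9]
    print(statistics.mean(data))    # 8
    print(statistics.stdev(data))   # 1.0    sample standard deviation (divides by n - 1)
    print(statistics.pstdev(data))  # ~0.816 population standard deviation (divides by n)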

by Shankshera (7.1k points)