Final answer:
The term 'standard deviation' measures the variability of data values around their mean, while 'range' measures the overall spread of the individual data items. The answer is d. Standard Deviation - Range.
Step-by-step explanation:
The standard deviation describes how much the data values vary around their mean. The range is the simplest measure of the variability of individual data items. The correct answer is choice d: Standard Deviation - Range.
The standard deviation is a number that measures how far data values lie from their mean, and it is a key indicator of the spread or dispersion within a set of data. When the standard deviation is zero, there is no spread, i.e., all data values are the same. A small standard deviation means the data are closely clustered around the mean, while a larger value indicates that the data are more spread out.
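As a quick illustration, here is a minimal Python sketch (the data values are made up for the example) that computes the population standard deviation as the square root of the average squared deviation from the mean:

```python
import math

def std_dev(values):
    """Population standard deviation: the square root of the
    average squared deviation of each value from the mean."""
    mean = sum(values) / len(values)
    variance = sum((x - mean) ** 2 for x in values) / len(values)
    return math.sqrt(variance)

print(std_dev([4, 4, 4, 4]))              # 0.0 -> all values identical, no spread
print(std_dev([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0 -> values cluster near the mean of 5
```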
The range, on the other hand, is the difference between the largest and smallest data values in a set, which makes it the simplest measure of variability. It shows how wide a span the data cover, without giving any indication of how the values are distributed around the mean.
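For comparison, the range needs only the two extreme values (same made-up data as above):

```python
def data_range(values):
    """Range: the difference between the largest and smallest data value."""
    return max(values) - min(values)

print(data_range([2, 4, 4, 4, 5, 5, 7, 9]))  # 7, i.e. 9 - 2
```

Notice that a second made-up data set such as [2, 2, 2, 9, 9, 9] has the same range of 7 but a larger standard deviation (3.5 instead of 2.0), which is exactly why the range says nothing about how the values are distributed around the mean.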