Final answer:
A z-score tells you how many standard deviations a data point lies from the mean, which makes it a simple tool for spotting outliers. Data points with z-scores less than -3 or greater than 3 are commonly treated as outliers, because they fall outside the range in which 99.7% of data is expected to lie according to the empirical rule.
Step-by-step explanation:
Z-scores identify outliers by measuring how far an individual data point x lies from the mean of the data set, in units of the standard deviation: z = (x − μ) / σ. A data point with a z-score less than -3 or greater than 3 is typically considered an outlier. This is because under the standard normal distribution (Z ~ N(0, 1)), about 99.7 percent of the data falls within three standard deviations of the mean, so a point outside that range is markedly different from the rest of the data.
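The rule above can be sketched in a few lines of Python. This is an illustrative helper (the names `z_scores` and `detect_outliers` are mine, not standard functions); it standardizes each value and flags those whose z-score exceeds the threshold in absolute value. Note that with a small sample, a single extreme value inflates the standard deviation and can hide itself, so the example uses a reasonably large sample.

```python
# Illustrative sketch: flagging outliers by z-score.
from statistics import mean, stdev

def z_scores(data):
    """Return the z-score (x - mean) / stdev for each value in data."""
    m = mean(data)
    s = stdev(data)
    return [(x - m) / s for x in data]

def detect_outliers(data, threshold=3.0):
    """Return the values whose |z| exceeds the threshold (default 3)."""
    return [x for x, z in zip(data, z_scores(data)) if abs(z) > threshold]

# 24 values clustered near 10-13, plus one extreme value:
data = [10, 12, 11, 13, 12, 11, 10, 12] * 3 + [100]
print(detect_outliers(data))  # only 100 has |z| > 3
```

Here only the value 100 is flagged: its z-score is roughly 4.8, while every other point sits well within one standard deviation of the mean.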
According to the empirical rule, also known as the 68-95-99.7 rule, approximately 68 percent of the data falls within one standard deviation of the mean, 95 percent within two standard deviations, and 99.7 percent within three standard deviations. A z-score greater than 3 or less than -3 is therefore a strong indication that a data point is an outlier.
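You can verify the 68-95-99.7 percentages directly from the standard normal CDF, e.g. with Python's built-in `statistics.NormalDist`:

```python
# Check the empirical rule against the standard normal distribution.
from statistics import NormalDist

z = NormalDist()  # standard normal: mean 0, stdev 1
for k in (1, 2, 3):
    coverage = z.cdf(k) - z.cdf(-k)  # P(-k < Z < k)
    print(f"within {k} standard deviation(s): {coverage:.1%}")
```

This prints approximately 68.3%, 95.4%, and 99.7%, matching the rule.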
This method is also useful for comparing individual data points across data sets with different means and standard deviations. For example, if John's GPA has a z-score of -0.21 and Ali's GPA has a z-score of -0.30 within their respective schools, John's GPA is closer to his school's mean, so his standing is relatively higher than Ali's even though both GPAs fall below their school averages.