A set of data has a mean of 12 and a standard deviation of 3. A data point of the set has a z-score of 1.3. What does a z-score of 1.3 mean?

Asked by Bensw

2 Answers


Answer:

B. The data point is 1.3 standard deviations away from 12.

Answered by KorHosik

A z-score tells you how many standard deviations a point is from the mean.

z = (x - mean) / (std dev)

Working backwards with this formula,

x = z * (std dev) + mean
x = 1.3 * 3 + 12
x = 15.9

So the data point is 15.9, which is 1.3 standard deviations above the mean of 12.
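
If you want to double-check the arithmetic, a couple of lines of Python will do it; the variable names below are just illustrative.

```python
# Recover the data point from its z-score (values from the problem; names are illustrative)
mean = 12
std_dev = 3
z = 1.3

# Rearranged z-score formula: x = z * std_dev + mean
x = z * std_dev + mean
print(x)                     # 15.9 (up to floating-point rounding)

# Sanity check: recomputing the z-score from x gives 1.3 back
print((x - mean) / std_dev)  # 1.3 (up to floating-point rounding)
```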

Hope this helps.
Answered by Onato