If a set of scores has a mean of 100.00 and a standard deviation of 5.00, what is the variance of the standard scores?
asked by Mindphaser (4.8k points)

1 Answer


Answer:

Variance is 25

Step-by-step explanation:

Recall that the standard deviation is defined as the square root of the variance. Therefore, if you know the standard deviation $\sigma$, square it to get the variance:


$\text{Variance} = \sigma^2 = 5^2 = 25$
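If you want to double-check the arithmetic, here is a minimal sketch in plain Python (the variable names are just for illustration, not part of the original answer):

```python
# Given standard deviation
sigma = 5.00

# The variance is the square of the standard deviation
variance = sigma ** 2

print(variance)  # prints 25.0
```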

answered by David Barlow (4.6k points)