35 votes
If each value in a data set is multiplied by a constant k, what would happen to the standard deviation?

A) The standard deviation would remain the same.
B) The standard deviation would be multiplied by k.
C) The standard deviation would decrease by k as well.
D) There is not enough information to know what would happen.

asked by User Selle (2.7k points)

1 Answer

14 votes

Answer: B) The standard deviation would be multiplied by k.

Explanation:

Multiplying every value by k also multiplies the mean by k, so each deviation (xᵢ − x̄) is scaled by k as well. The standard deviation therefore becomes |k|·σ, which equals k·σ for k > 0. (Also confirmed on USATestprep.)
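To see this numerically, here is a quick check in Python using a made-up data set (the values and k = 3 are illustrative, not from the question):

```python
import statistics

# Illustrative data set and constant (not from the original question).
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
k = 3.0

sd_original = statistics.pstdev(data)                 # population SD of data
sd_scaled = statistics.pstdev([k * x for x in data])  # SD after scaling by k

# Scaling every value by k multiplies the standard deviation by |k|.
print(sd_original)  # 2.0
print(sd_scaled)    # 6.0
print(sd_scaled == abs(k) * sd_original)  # True
```

The same relationship holds for the sample standard deviation (`statistics.stdev`), since the scaling factor passes straight through the square root.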

answered by User AHHP (3.0k points)