If each value in a data set is multiplied by a constant k, what would happen to the standard deviation?

A) The standard deviation would remain the same.
B) The standard deviation would be multiplied by k.
C) The standard deviation would decrease by k as well.
D) There is not enough information to know what would happen.

asked by Awena (3.7k points)

1 Answer


Answer: B. The standard deviation is multiplied by k.

Explanation:

Multiplying every value by k also multiplies the mean by k, so each deviation (x − mean) is multiplied by k. The variance, which averages the squared deviations, is therefore multiplied by k², and taking the square root leaves the standard deviation multiplied by |k|. For a positive constant k, that is exactly k. (I checked this against the USATestprep question as well.)
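A quick numeric check of this rule, using a small made-up data set and a hypothetical constant k = 3 with Python's standard library:

```python
import statistics

data = [2.0, 4.0, 6.0, 8.0, 10.0]  # hypothetical example data
k = 3                              # hypothetical constant multiplier

scaled = [k * x for x in data]

sd_original = statistics.stdev(data)
sd_scaled = statistics.stdev(scaled)

print(sd_original)              # sample SD of the original data
print(sd_scaled)                # sample SD after multiplying every value by k
print(sd_scaled / sd_original)  # equals k (up to floating-point rounding)
```

The ratio of the two standard deviations comes out to k regardless of which data set you start from, which is why option B is correct.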

answered by BogdanCsn (4.6k points)