2 votes
If the standard deviation of a data set were originally 12, and if each value in the data set were multiplied by 1.75, what would be the standard deviation of the resulting data?

User Go (5.3k points)

2 Answers

2 votes

Answer:

21 (12 × 1.75 = 21)

Explanation:

Multiplying every value in a data set by a constant multiplies the standard deviation by that same constant, so the new standard deviation is 12 × 1.75 = 21.

User MGwynne (4.8k points)
6 votes

Answer: 21

Explanation:

Take 12 and multiply it by 1.75 (12 × 1.75) and you'll get 21. Scaling every data value by a constant scales the standard deviation by that same constant, which is why you can multiply the original standard deviation directly.

User Jeenyus (4.5k points)
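The scaling rule behind both answers can be checked numerically. A minimal sketch using Python's standard library, with a hypothetical data set (the question only specifies the standard deviation, not the values):

```python
import statistics

# Hypothetical data set; any values work, since we only compare before/after.
data = [10.0, 22.0, 34.0, 46.0, 58.0]

sd_before = statistics.pstdev(data)          # population standard deviation
scaled = [x * 1.75 for x in data]            # multiply every value by 1.75
sd_after = statistics.pstdev(scaled)

print(sd_after / sd_before)                  # ratio equals the scaling factor, 1.75
```

Whatever the original standard deviation is, the ratio comes out to 1.75, confirming that an original standard deviation of 12 becomes 12 × 1.75 = 21.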