3 votes
If the standard deviation of a data set was originally 5, and if each value in the data set was multiplied by 3.6, what would be the standard deviation of the resulting data?

User Horseyguy (8.5k points)

2 Answers

4 votes
You would take 5 and multiply it by 3.6, which gives you 18. Multiplying every value in a data set by a constant multiplies the standard deviation by that same constant. Hope it helped.
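A quick numerical check (a sketch using NumPy; the data values are made up so that the sample standard deviation is exactly 5) shows that scaling every value by 3.6 scales the standard deviation by 3.6:

import numpy as np

# Hypothetical data set whose sample standard deviation (ddof=1) is exactly 5
data = np.array([10.0, 15.0, 20.0])
print(np.std(data, ddof=1))      # 5.0

# Multiply every value by 3.6 and recompute
scaled = 3.6 * data
print(np.std(scaled, ddof=1))    # 18.0  (= 3.6 * 5)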
User Thom (8.7k points)
4 votes

Answer:

New Standard Deviation = 18

Explanation:


\text{The sample standard deviation is given by: } s=\sqrt{\frac{\sum_{i=1}^{n} (x_i-\bar{x})^2}{n-1}}

When each value is multiplied by 3.6, n is unaffected, since the number of terms stays the same.

But the mean will be affected.


\bar{x}=\frac{\sum_{i=1}^{n} x_i}{n}\\
\text{Each term } x_i \text{ is multiplied by 3.6, so the new terms are } 3.6\,x_i\\
\implies \text{New mean} = \frac{\sum_{i=1}^{n} 3.6\,x_i}{n} = 3.6\,\bar{x}

So the new standard deviation becomes:


s'=\sqrt{\frac{\sum_{i=1}^{n} (3.6\,x_i-3.6\,\bar{x})^2}{n-1}}
= 3.6\sqrt{\frac{\sum_{i=1}^{n} (x_i-\bar{x})^2}{n-1}}
= 3.6\,s = 3.6 \times 5 = 18\\
\textbf{New Standard Deviation} = \bf 18
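A small sketch in plain Python (the data values are hypothetical) follows the same steps: compute the sample mean and standard deviation from the formula, multiply every term by 3.6, and confirm that the new mean is 3.6 times the old mean and the new standard deviation is 3.6 times the old one:

import math

x = [2.0, 7.0, 12.0]                 # hypothetical sample, n = 3, s = 5
n = len(x)
mean = sum(x) / n                    # sample mean
s = math.sqrt(sum((xi - mean) ** 2 for xi in x) / (n - 1))

y = [3.6 * xi for xi in x]           # every term multiplied by 3.6
new_mean = sum(y) / n
new_s = math.sqrt(sum((yi - new_mean) ** 2 for yi in y) / (n - 1))

print(new_mean, 3.6 * mean)          # new mean = 3.6 * old mean
print(new_s, 3.6 * s)                # new s = 3.6 * old s = 18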

User Feugy (8.1k points)