If the standard deviation of a data set was originally 5, and if each value in the data set was multiplied by 4.8, what would be the standard deviation of the resulting data?

Answer:
Multiplying every value in a data set by a constant c multiplies the standard deviation by |c| (the mean is also multiplied by c). Here c = 4.8, so the new standard deviation is 5 × 4.8 = 24.
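A quick check in Python, using a small made-up data set chosen so its population standard deviation is exactly 5:

```python
import statistics

# Illustrative data (assumed): mean 0, population standard deviation exactly 5.
data = [-5, 5]

# Multiply every value by the constant 4.8.
scaled = [x * 4.8 for x in data]

print(statistics.pstdev(data))    # 5.0
print(statistics.pstdev(scaled))  # 24.0
```

The same scaling holds for any data set: the spread stretches by the same factor as the values themselves.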