If the standard deviation of a data set was originally 5, and if each value in the data set was multiplied by 4.8, what would be the standard deviation of the resulting data?

User Darkade (7.4k points)

2 Answers

5 votes

When every value in a data set is multiplied by a constant, the mean and the standard deviation are both multiplied by that same constant. Here the standard deviation is multiplied by 4.8, so the new standard deviation is 5 × 4.8 = 24.
User GuavaMantis (8.7k points)
