Final answer:
It is true that scaling a variable by a constant c multiplies the variance by c², because variance is calculated from squared deviations about the mean. The expected value (mean) is multiplied by c, while the variance is multiplied by c².
Step-by-step explanation:
When you scale data in a dataset, you multiply each value by a constant c. If X is the original variable and c is the scaling factor, then cX is the scaled variable. Scaling affects both the expected value (mean) and the variance, but each changes in a different way.
When you multiply a variable X by a constant c, the new expected value is c times the original expected value, but the new variance is c² times the original variance. This is because variance measures squared deviations from the mean, so the scaling factor is squared when it is pulled out of the variance.
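A one-line derivation makes the squaring explicit; here μ = E[X] denotes the mean of the original variable:

```latex
\mathbb{E}[cX] = c\,\mathbb{E}[X] = c\mu, \qquad
\operatorname{Var}(cX) = \mathbb{E}\!\left[(cX - c\mu)^2\right]
                       = c^2\,\mathbb{E}\!\left[(X - \mu)^2\right]
                       = c^2\operatorname{Var}(X).
```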
If the original variable X has variance σ², then after scaling by a factor of c the new variable cX has variance c²σ². Therefore the statement that scaling multiplies the variance by c² is true. The mean is also affected, being scaled by c, but that part is not addressed in the original statement.
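A quick numerical check illustrates the same relationship on sample data. This is only a sketch: the constant c = 3, the simulated sample, and the use of NumPy are illustrative assumptions, not part of the original answer.

```python
import numpy as np

# Illustrative sample and scaling constant (arbitrary choices).
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)
c = 3.0

scaled = c * x

# Mean of the scaled data equals c times the original mean.
print(np.mean(scaled), c * np.mean(x))

# Variance of the scaled data equals c² times the original variance.
print(np.var(scaled), c**2 * np.var(x))
```

Both printed pairs match (up to floating-point rounding), confirming that the mean scales by c while the variance scales by c².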