Final answer:
The variance of X − Y, var[X−Y], equals var[X] + var[Y] − 2cov[X,Y]. This follows from expanding the expected squared deviation of X − Y from its mean into its component variances and a covariance term; the covariance is subtracted because the operation between the variables is subtraction.
Step-by-step explanation:
To show that the variance of the difference between two random variables X and Y, denoted as var[X−Y], is equal to var[X] + var[Y] − 2cov[X,Y], we start by looking at the definition of variance.
The variance of a random variable is the mean of the squared deviations from the mean, represented for a discrete variable as var[X] = Σ(x − μ)²P(x), where μ is the mean of X, x represents values of X, and P(x) represents their respective probabilities.
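As a quick illustration, the discrete formula can be evaluated directly. The distribution below is invented for the example:

```python
# Hypothetical discrete distribution for X: values with their probabilities.
values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]

mu = sum(x * p for x, p in zip(values, probs))                  # mean μ of X
var_x = sum((x - mu) ** 2 * p for x, p in zip(values, probs))   # Σ(x - μ)²P(x)

print(mu, var_x)  # mean 2.1, variance 0.49
```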
When combining two random variables, the property of the covariance term, cov[X,Y], comes into play. This term measures how much two random variables change together. If X and Y are independent, their covariance is zero. In the case of calculating the variance of the difference of X and Y, the covariance will be subtracted rather than added because the operation between variables is subtraction, not addition.
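The independence claim is easy to check numerically. In this sketch (marginals invented for the example), the joint distribution is built as a product of marginals, which is exactly what independence means, and the covariance comes out to zero:

```python
# Hypothetical marginal distributions for independent X and Y.
px = {0: 0.4, 1: 0.6}   # marginal pmf of X
py = {2: 0.5, 3: 0.5}   # marginal pmf of Y

mu_x = sum(x * p for x, p in px.items())
mu_y = sum(y * p for y, p in py.items())

# cov[X,Y] = E[(X - μx)(Y - μy)]; under independence the joint
# probability factors as P(X = x) * P(Y = y).
cov = sum((x - mu_x) * (y - mu_y) * px[x] * py[y]
          for x in px for y in py)
print(cov)  # 0 (up to floating-point rounding)
```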
So, var[X−Y] is calculated as follows:
- Let μ_X = E[X] and μ_Y = E[Y]. By linearity of expectation, E[X − Y] = μ_X − μ_Y.
- By the definition of variance, var[X−Y] = E[((X − Y) − (μ_X − μ_Y))²] = E[((X − μ_X) − (Y − μ_Y))²].
- Expand the square: var[X−Y] = E[(X − μ_X)²] − 2E[(X − μ_X)(Y − μ_Y)] + E[(Y − μ_Y)²].
- Recognize each term as a variance or the covariance: var[X−Y] = var[X] + var[Y] − 2cov[X,Y].
This formula shows that whether we form X + Y or X − Y, the variances are always summed; only the covariance term changes sign: var[X+Y] = var[X] + var[Y] + 2cov[X,Y], while var[X−Y] = var[X] + var[Y] − 2cov[X,Y].
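The steps above can be verified numerically against an exact joint distribution. The joint pmf below is invented for the example and chosen so that X and Y are correlated; the variance of X − Y computed straight from its definition matches var[X] + var[Y] − 2cov[X,Y]:

```python
# Hypothetical joint pmf: pmf[(x, y)] = P(X = x, Y = y); probabilities sum to 1.
pmf = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.5}

def expect(f):
    """E[f(X, Y)] under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in pmf.items())

mu_x = expect(lambda x, y: x)
mu_y = expect(lambda x, y: y)
var_x = expect(lambda x, y: (x - mu_x) ** 2)
var_y = expect(lambda x, y: (y - mu_y) ** 2)
cov_xy = expect(lambda x, y: (x - mu_x) * (y - mu_y))

# Variance of the difference, computed directly from the definition...
mu_d = expect(lambda x, y: x - y)
var_diff_direct = expect(lambda x, y: (x - y - mu_d) ** 2)

# ...agrees with var[X] + var[Y] - 2cov[X,Y].
var_diff_formula = var_x + var_y - 2 * cov_xy
print(var_diff_direct, var_diff_formula)  # both 0.2
```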