Final answer:
The mean vector E[X] for the set of independent Gaussian random variables is a vector of repeated μ values. The covariance matrix ΣX is a diagonal matrix with each diagonal element equal to the variance σ², and the off-diagonal elements are zero.
Step-by-step explanation:
The question involves finding the mean vector and covariance matrix for a set of independent Gaussian random variables X1, X2, ..., Xn. Since these variables are independent and identically distributed (i.i.d.), with each having an expected value (mean) μ and variance σ², we can directly apply the definitions for mean vector and covariance matrix.
The mean vector E[X] of the random variables is simply a vector of their expected values, hence:
- E[X] = (μ, μ, ..., μ)ᵀ
The covariance matrix ΣX is a diagonal matrix since the variables are independent:
- ΣX = diag(σ², σ², ..., σ²)
Thus, each diagonal element of ΣX is the variance of the respective random variable, and the off-diagonal elements are zero, reflecting their independence.
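As a quick sanity check, the structure above can be sketched in NumPy (a minimal example; the values of μ, σ, and n are assumed for illustration):

```python
import numpy as np

# Assumed example parameters: n i.i.d. Gaussians with mean mu, variance sigma^2
mu, sigma, n = 2.0, 3.0, 4

# Mean vector: every entry equals mu
mean_vec = np.full(n, mu)

# Covariance matrix: sigma^2 on the diagonal, zeros off the diagonal
# (the zeros reflect independence of the variables)
cov = np.diag(np.full(n, sigma**2))

print(mean_vec)  # [2. 2. 2. 2.]
print(cov)       # 4x4 diagonal matrix with 9.0 on the diagonal
```

Drawing a large number of samples from `np.random.multivariate_normal(mean_vec, cov)` and computing the sample mean and sample covariance would recover these quantities approximately.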