Let X_1 be a normal random variable with mean \mu_1 and variance \sigma_1^2, and let X_2 be a normal random variable with mean \mu_2 and variance \sigma_2^2. Assuming that X_1 and X_2 are independent, what is the distribution of X_1 + X_2?



Answer:


X_1 + X_2 \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)

Explanation:

We are given the following in the question:


X_1 is a normal random variable with mean \mu_1 and variance \sigma_1^2:

X_1 \sim N(\mu_1, \sigma_1^2)


X_2 is a normal random variable with mean \mu_2 and variance \sigma_2^2:

X_2 \sim N(\mu_2, \sigma_2^2)


X_1 and X_2 are independent random variables.

Let


Z = X_1 + X_2

Then Z has a normal distribution with mean equal to the sum of the two means and variance equal to the sum of the two variances.
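One standard way to see why (a moment-generating-function argument, added here for completeness):

M_{X_1}(t) = e^{\mu_1 t + \sigma_1^2 t^2/2}, \quad M_{X_2}(t) = e^{\mu_2 t + \sigma_2^2 t^2/2}

By independence,

M_Z(t) = M_{X_1}(t)\, M_{X_2}(t) = e^{(\mu_1 + \mu_2) t + (\sigma_1^2 + \sigma_2^2) t^2/2}

which is the moment-generating function of N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2), so Z is normal with exactly those parameters.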

Thus, we can write:


\mu = \mu_1 + \mu_2

\sigma^2 = \sigma_1^2 + \sigma_2^2

Z \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)
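As a quick sanity check, here is a minimal simulation sketch in Python with NumPy; the means, standard deviations, and sample size below are arbitrary values chosen only for illustration:

import numpy as np

# Empirically check that the sum of two independent normals has
# mean mu1 + mu2 and variance sigma1^2 + sigma2^2.
rng = np.random.default_rng(seed=0)
mu1, sigma1 = 1.0, 2.0    # X1 ~ N(1, 4)
mu2, sigma2 = -3.0, 1.5   # X2 ~ N(-3, 2.25)

n = 1_000_000
x1 = rng.normal(mu1, sigma1, size=n)
x2 = rng.normal(mu2, sigma2, size=n)
z = x1 + x2

print(z.mean())  # close to mu1 + mu2 = -2.0
print(z.var())   # close to sigma1**2 + sigma2**2 = 6.25

Note that the independence assumption matters: for dependent variables the variance of the sum picks up a 2 Cov(X_1, X_2) term, and a sum of dependent normals need not even be normal unless they are jointly normal.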
