Let X_1 and X_2 be independent random variables, each with mean μ and variance σ².

Suppose that we have 2 estimators of μ:

\hat \theta_1 = \frac{X_1 + X_2}{2}
\hat \theta_2 = \frac{X_1 + 3X_2}{4}

a) Are both estimators unbiased estimators of μ?
b) What is the variance of each estimator?

1 Answer


Answer:

a)
E(\hat \theta_1) = \frac{1}{2}[E(X_1) + E(X_2)] = \frac{1}{2}[\mu + \mu] = \mu

So we conclude that \hat \theta_1 is an unbiased estimator of \mu.


E(\hat \theta_2) = \frac{1}{4}[E(X_1) + 3E(X_2)] = \frac{1}{4}[\mu + 3\mu] = \mu

So we conclude that \hat \theta_2 is an unbiased estimator of \mu.

b)
Var(\hat \theta_1) = \frac{1}{4}[\sigma^2 + \sigma^2] = \frac{\sigma^2}{2}


Var(\hat \theta_2) = \frac{1}{16}[\sigma^2 + 9\sigma^2] = \frac{5\sigma^2}{8}

Explanation:

For this case we have two random variables, X_1 and X_2, both with mean \mu and variance \sigma^2.

And we define the following estimators:


\hat \theta_1 = \frac{X_1 + X_2}{2}

\hat \theta_2 = \frac{X_1 + 3X_2}{4}

Part a

In order to see whether both estimators are unbiased, we need to check whether the expected value of each estimator equals the true value of the parameter:


E(\hat \theta_i) = \mu, \quad i = 1, 2

So let's find the expected values for each estimator:


E(\hat \theta_1) = E\left(\frac{X_1 + X_2}{2}\right)

Using properties of expected value we have this:


E(\hat \theta_1) = \frac{1}{2}[E(X_1) + E(X_2)] = \frac{1}{2}[\mu + \mu] = \mu

So we conclude that \hat \theta_1 is an unbiased estimator of \mu.

For the second estimator we have:


E(\hat \theta_2) = E\left(\frac{X_1 + 3X_2}{4}\right)

Using properties of expected value we have this:


E(\hat \theta_2) = \frac{1}{4}[E(X_1) + 3E(X_2)] = \frac{1}{4}[\mu + 3\mu] = \mu

So we conclude that \hat \theta_2 is an unbiased estimator of \mu.
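The conclusions of Part a can be sanity-checked with a short Monte Carlo simulation. This is only an illustrative sketch, not part of the derivation; the values mu = 5 and sigma = 2 are arbitrary choices.

```python
import random

random.seed(42)
mu, sigma, n = 5.0, 2.0, 200_000  # mu and sigma are arbitrary illustrative values

theta1_samples = []
theta2_samples = []
for _ in range(n):
    x1 = random.gauss(mu, sigma)
    x2 = random.gauss(mu, sigma)
    theta1_samples.append((x1 + x2) / 2)      # first estimator
    theta2_samples.append((x1 + 3 * x2) / 4)  # second estimator

# The averages of both estimators should be close to mu = 5
mean1 = sum(theta1_samples) / n
mean2 = sum(theta2_samples) / n
```

With n this large, both sample means land within a few hundredths of mu, consistent with both estimators being unbiased.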

Part b

For the variance we need to remember this property: if a is a constant and X is a random variable, then:


Var(aX) = a^2 Var(X)
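This scaling property is easy to verify numerically. The sketch below (with arbitrary illustrative values a = 3 and sigma = 2) compares the sample variance of aX against the sample variance of X; the ratio should be a².

```python
import random

random.seed(0)
a, sigma = 3.0, 2.0  # arbitrary illustrative values
xs = [random.gauss(0.0, sigma) for _ in range(100_000)]

def var(data):
    # population-style variance of a sample
    m = sum(data) / len(data)
    return sum((v - m) ** 2 for v in data) / len(data)

# Var(aX) / Var(X) should equal a**2 = 9 (up to floating-point rounding)
ratio = var([a * x for x in xs]) / var(xs)
```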

For the first estimator we have:


Var(\hat \theta_1) = Var\left(\frac{X_1 + X_2}{2}\right)


Var(\hat \theta_1) = \frac{1}{4} Var(X_1 + X_2) = \frac{1}{4}[Var(X_1) + Var(X_2) + 2\,Cov(X_1, X_2)]

Since both random variables are independent, we know that Cov(X_1, X_2) = 0, so we have:


Var(\hat \theta_1) = \frac{1}{4}[\sigma^2 + \sigma^2] = \frac{\sigma^2}{2}

For the second estimator we have:


Var(\hat \theta_2) = Var\left(\frac{X_1 + 3X_2}{4}\right)


Var(\hat \theta_2) = \frac{1}{16} Var(X_1 + 3X_2) = \frac{1}{16}[Var(X_1) + Var(3X_2) + 2\,Cov(X_1, 3X_2)]

Since both random variables are independent, we know that Cov(X_1, 3X_2) = 3\,Cov(X_1, X_2) = 0, and Var(3X_2) = 9\,Var(X_2), so we have:


Var(\hat \theta_2) = \frac{1}{16}[\sigma^2 + 9\sigma^2] = \frac{5\sigma^2}{8}
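As with Part a, the two variance results can be checked by simulation. This is a hedged sketch with the same arbitrary values mu = 5 and sigma = 2 as before, so the theoretical variances are sigma²/2 = 2.0 and 5·sigma²/8 = 2.5.

```python
import random

random.seed(1)
mu, sigma, n = 5.0, 2.0, 200_000  # arbitrary illustrative values

def sample_var(data):
    # unbiased sample variance (divides by n - 1)
    m = sum(data) / len(data)
    return sum((v - m) ** 2 for v in data) / (len(data) - 1)

t1, t2 = [], []
for _ in range(n):
    x1 = random.gauss(mu, sigma)
    x2 = random.gauss(mu, sigma)
    t1.append((x1 + x2) / 2)      # \hat\theta_1
    t2.append((x1 + 3 * x2) / 4)  # \hat\theta_2

v1 = sample_var(t1)  # theory: sigma**2 / 2 = 2.0
v2 = sample_var(t2)  # theory: 5 * sigma**2 / 8 = 2.5
```

The simulation also illustrates why \hat \theta_1 is the better choice of the two: both are unbiased, but \hat \theta_1 has the smaller variance.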

Answered by user Yenssen.