Consider the simple linear regression model Y_i = \beta_0 + \beta_1 x_i + \epsilon_i, where the \epsilon_i are independent N(0, \sigma^2) random variables. Therefore, Y_i is a normal random variable with mean \beta_0 + \beta_1 x_i and variance \sigma^2. Moreover, the Y_i are independent. As usual, we have the observed data pairs (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), from which we would like to estimate \beta_0 and \beta_1. In this chapter, we found the following estimators:

\hat\beta_1 = s_{xy} / s_{xx} , \hat\beta_0 = \bar Y - \hat\beta_1 \bar x

where

s_{xx} = \sum_{i=1}^n (x_i - \bar x)^2 , s_{xy} = \sum_{i=1}^n (x_i - \bar x)(Y_i - \bar Y)

Show that \hat\beta_1 is a normal random variable. Show that \hat\beta_1 is an unbiased estimator of \beta_1, i.e., E[\hat\beta_1] = \beta_1. Show that Var(\hat\beta_1) = \sigma^2 / s_{xx}.

asked by User Ire

1 Answer


Answer:

See proof below.

Explanation:

If we assume the following linear model:

y = \beta_0 + \beta_1 x + \epsilon

And if we have n pairs of observations (x_i, y_i), i = 1, 2, ..., n, the model can be written like this:

y_i = \beta_0 + \beta_1 x_i + \epsilon_i , i = 1, 2, ..., n

Using the least squares procedure gives us the following least squares estimates, b_0 for \beta_0 and b_1 for \beta_1:

b_0 = \bar y - b_1 \bar x

b_1 = s_{xy} / s_{xx}

Where:

s_{xy} = \sum_{i=1}^n (x_i - \bar x)(y_i - \bar y)

s_{xx} = \sum_{i=1}^n (x_i - \bar x)^2

The estimator b_1 is a random variable, since it depends on the random observations y_i. We can express it as a linear combination of the observations:

b_1 = \sum_{i=1}^n a_i y_i

Where a_i = (x_i - \bar x) / s_{xx}. Because b_1 is a linear combination of the independent normal random variables y_i, it is itself a normal random variable, which proves the first claim. If we look carefully, we also notice that

\sum_{i=1}^n a_i = 0 and \sum_{i=1}^n a_i x_i = 1
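These two identities for the weights a_i can be checked numerically. A minimal sketch, using hypothetical x values since no specific data are given:

```python
import numpy as np

# Hypothetical x values, just to illustrate the weight identities
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x_bar = x.mean()
s_xx = np.sum((x - x_bar) ** 2)

# Weights a_i = (x_i - x_bar) / s_xx
a = (x - x_bar) / s_xx

print(a.sum())        # numerically 0
print(np.sum(a * x))  # numerically 1
```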

So when we find the expected value we get:

E(b_1) = \sum_{i=1}^n a_i E(y_i)

E(b_1) = \sum_{i=1}^n a_i (\beta_0 + \beta_1 x_i)

E(b_1) = \beta_0 \sum_{i=1}^n a_i + \beta_1 \sum_{i=1}^n a_i x_i

And since \sum_{i=1}^n a_i = 0, the first term vanishes, leaving:

E(b_1) = \beta_1 \sum_{i=1}^n a_i x_i = \beta_1

And as we can see, b_1 is an unbiased estimator for \beta_1.
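Unbiasedness can also be checked by simulation: averaging b_1 over many simulated data sets should recover \beta_1. This is only a sketch, with hypothetical parameter values (\beta_0 = 2, \beta_1 = 0.5, \sigma = 1) and toy x values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical true parameters and design points
beta0, beta1, sigma = 2.0, 0.5, 1.0
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x_bar = x.mean()
s_xx = np.sum((x - x_bar) ** 2)

# Simulate many data sets and compute b_1 = s_xy / s_xx for each
n_sims = 20_000
b1 = np.empty(n_sims)
for k in range(n_sims):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    b1[k] = np.sum((x - x_bar) * (y - y.mean())) / s_xx

print(b1.mean())  # close to beta1 = 0.5
```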

In order to find the variance of the estimator b_1 we have:

Var(b_1) = \sum_{i=1}^n a_i^2 Var(y_i) + \sum_i \sum_{j \neq i} a_i a_j Cov(y_i, y_j)

And Cov(y_i, y_j) = 0 since the observations are assumed independent, so with Var(y_i) = \sigma^2 we have:

Var(b_1) = \sigma^2 \sum_{i=1}^n a_i^2 = \sigma^2 \sum_{i=1}^n (x_i - \bar x)^2 / s_{xx}^2

And if we simplify we get:

Var(b_1) = \sigma^2 s_{xx} / s_{xx}^2 = \sigma^2 / s_{xx}

And with this we complete the proof required.
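As a final sanity check, the variance formula \sigma^2 / s_{xx} can be verified by simulation. This sketch reuses the same hypothetical parameters (\beta_0 = 2, \beta_1 = 0.5, \sigma = 1) and toy x values as above:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical true parameters and design points
beta0, beta1, sigma = 2.0, 0.5, 1.0
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x_bar = x.mean()
s_xx = np.sum((x - x_bar) ** 2)  # = 10, so Var(b_1) should be 0.1

# Simulate many data sets at once and compute b_1 for each row
n_sims = 20_000
y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=(n_sims, x.size))
b1 = ((y - y.mean(axis=1, keepdims=True)) * (x - x_bar)).sum(axis=1) / s_xx

print(b1.var())  # close to sigma**2 / s_xx = 0.1
```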

answered by User Suresh Suthar