32 votes
7. Classical linear model assumptions for time series

Consider the stochastic process {(x_t1, x_t2, x_t3, . . . , x_tk, y_t): t = 1, 2, . . . , n} that follows the linear model

y_t = β0 + β1 x_t1 + β2 x_t2 + β3 x_t3 + . . . + βk x_tk + u_t

where {u_t: t = 1, 2, . . . , n} is the sequence of error terms and n is the number of observations (time periods). What are the minimum Gauss–Markov assumptions needed for the OLS estimators β̂_j, for j = 1, 2, . . . , k, to be the best linear unbiased estimators (BLUE) conditional on the explanatory variables for all time periods (X)? Check all that apply.

User Kassandra
by
3.3k points

1 Answer

8 votes

Answer:

The answer options are missing from the question. For this standard problem they are:

TS.1: Linear in parameters.

TS.2: No perfect collinearity.

TS.3: Zero conditional mean.

TS.4: Homoskedasticity.

TS.5: No serial correlation.

TS.6: Normality.

Hence the correct answers are TS.1 through TS.5.

Explanation:

Assumptions TS.1 through TS.5 are the minimum set needed for the OLS estimators to be the best linear unbiased estimators conditional on the explanatory variables for all time periods.

The normality assumption (TS.6) is not needed for the estimators to have the BLUE property; it is required only for exact finite-sample inference, such as t and F tests.
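As an illustrative sketch (not part of the original answer), the claim can be checked by simulation: generate data satisfying TS.1 through TS.5 with non-normal errors, estimate the model by OLS many times, and observe that the average of the estimates is close to the true coefficients. The sample size, coefficient values, and uniform error distribution below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 500
beta = np.array([1.0, 2.0, -0.5])  # true (beta0, beta1, beta2), chosen arbitrarily

estimates = np.empty((reps, 3))
for r in range(reps):
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])    # TS.2: no perfect collinearity
    # TS.3-TS.5: zero-mean, homoskedastic, serially uncorrelated errors --
    # here uniform, i.e. deliberately NOT normal (TS.6 does not hold)
    u = rng.uniform(-1.0, 1.0, size=n)
    y = X @ beta + u                             # TS.1: linear in parameters
    estimates[r] = np.linalg.lstsq(X, y, rcond=None)[0]

# The average estimate across replications is close to the true beta,
# illustrating unbiasedness without normality.
print(np.round(estimates.mean(axis=0), 2))
```

Normality would matter only if one went on to do exact t or F tests on the estimated coefficients in a finite sample.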

User Hoonoh
by
3.3k points