Final answer:
Maximum likelihood estimators for β and σ² in a regression model through the origin are derived. The estimator β̂ is Σ(X_iY_i)/Σ(X_i²), and the maximum likelihood estimator of σ² is SSE/n, where SSE is the sum of squared residuals; dividing by n − 1 instead gives the unbiased estimator. Because the errors are assumed normally distributed, β̂ coincides with the least squares estimator.
Step-by-step explanation:
The question asks for the maximum likelihood estimators of β and σ² in a simple linear regression model in which the mean of Y is 0 when X = 0, i.e., a regression through the origin. In this model, the regression equation is Y_i = βX_i + ε_i, where the ε_i are independent and normally distributed with mean 0 and variance σ².
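The maximization can be written out explicitly. A sketch of the derivation from the normal log-likelihood:

```latex
\ell(\beta,\sigma^2)
  = -\frac{n}{2}\log\!\left(2\pi\sigma^2\right)
    - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\left(Y_i - \beta X_i\right)^2

\frac{\partial \ell}{\partial \beta}
  = \frac{1}{\sigma^2}\sum_{i=1}^{n} X_i\left(Y_i - \beta X_i\right) = 0
  \;\Rightarrow\;
  \hat\beta = \frac{\sum_{i=1}^{n} X_i Y_i}{\sum_{i=1}^{n} X_i^2}

\frac{\partial \ell}{\partial \sigma^2}
  = -\frac{n}{2\sigma^2}
    + \frac{\sum_{i=1}^{n}\left(Y_i - \hat\beta X_i\right)^2}{2\sigma^4} = 0
  \;\Rightarrow\;
  \hat\sigma^2 = \frac{\mathrm{SSE}}{n}
```

Note that maximizing over σ² yields the divisor n, not n − 1; the n − 1 divisor belongs to the unbiased variance estimator, which is a separate (non-ML) adjustment.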
To find the maximum likelihood estimate of β, we can minimize the sum of squared errors (SSE), since the least squares estimator coincides with the maximum likelihood estimator when the errors are normally distributed. Setting the derivative of the SSE with respect to β equal to zero gives β̂ = Σ(X_iY_i)/Σ(X_i²).
Maximizing the likelihood over σ² gives σ̂² = SSE/n, where SSE = Σ(Y_i − β̂X_i)². This MLE is biased; dividing by the degrees of freedom n − 1 instead (only one parameter, β, is estimated in a through-origin model) gives the unbiased estimator of σ². Since β̂ is a linear combination of the normally distributed Y_i, it is itself normally distributed, with mean β and variance σ²/Σ(X_i²). Furthermore, SSE/σ² follows a chi-squared distribution with n − 1 degrees of freedom; this holds under the normality assumption alone and does not require a null hypothesis of no relationship between X and Y.
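These distributional facts can be checked numerically. A minimal simulation sketch, assuming NumPy; the design points, true β = 3, and σ² = 4 used here are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: fixed design, true beta = 3, true sigma^2 = 4.
n, beta, sigma2 = 20, 3.0, 4.0
x = np.linspace(1.0, 5.0, n)

reps = 5000
beta_hats = np.empty(reps)
scaled_sse = np.empty(reps)
for r in range(reps):
    y = beta * x + rng.normal(0.0, np.sqrt(sigma2), size=n)
    b = np.sum(x * y) / np.sum(x ** 2)   # through-origin estimator
    beta_hats[r] = b
    scaled_sse[r] = np.sum((y - b * x) ** 2) / sigma2

# beta_hat should be centered at the true beta, and SSE/sigma^2
# should average n - 1 (the mean of a chi-squared_{n-1} variable).
print(beta_hats.mean(), scaled_sse.mean())
```

Over many replications the average of β̂ sits at the true β and the average of SSE/σ² sits at n − 1, consistent with the normal and chi-squared results stated above.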
Therefore, the maximum likelihood estimator of β is β̂ = Σ(X_iY_i)/Σ(X_i²) and the maximum likelihood estimator of σ² is SSE/n; the corresponding unbiased estimator of σ² is SSE/(n − 1).
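Putting both formulas to work on data is straightforward. A sketch assuming NumPy, with synthetic data generated from a hypothetical true model Y = 2X + ε:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: true model Y = 2X + e with e ~ N(0, 1).
n = 50
x = rng.uniform(1.0, 10.0, size=n)
y = 2.0 * x + rng.normal(0.0, 1.0, size=n)

# MLE of beta for regression through the origin.
beta_hat = np.sum(x * y) / np.sum(x ** 2)

# SSE and the two variance estimators: the MLE divides by n,
# while the unbiased estimator divides by n - 1.
sse = np.sum((y - beta_hat * x) ** 2)
sigma2_mle = sse / n
sigma2_unbiased = sse / (n - 1)

print(beta_hat, sigma2_mle, sigma2_unbiased)
```

With the seed fixed as above, β̂ lands close to the true slope of 2, and the MLE of σ² is always slightly smaller than the unbiased version since it divides the same SSE by a larger number.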