Let X_1, X_2, \dots, X_n be a random sample from N(\mu, \sigma^2), where the mean \theta = \mu satisfies -\infty < \theta < \infty and \sigma^2 is a known positive number. Show that the maximum likelihood estimator of \theta is \hat \theta = \bar X.

1 Answer


Answer:


l'(\theta) = \frac{1}{\sigma^2} \sum_{i=1}^n (X_i - \theta)

The maximum occurs where l'(\theta) = 0, which holds if and only if:


\hat \theta = \bar X

Explanation:

For this case we have a random sample X_1, X_2, \dots, X_n where X_i \sim N(\mu = \theta, \sigma^2), with \sigma^2 fixed and known. We want to show that the maximum likelihood estimator of \theta is \hat \theta = \bar X.

The first step is to obtain the probability density function of the random variable X. Each X_i, i = 1, \dots, n, has the following density function:


f(x_i \mid \theta, \sigma^2) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x_i - \theta)^2}{2\sigma^2}\right), \quad -\infty < x_i < \infty

The likelihood function is given by:


L(\theta) = \prod_{i=1}^n f(x_i \mid \theta, \sigma^2)

Assuming the observations in the random sample are independent, and substituting the density function, we have:


L(\theta) = \left(\frac{1}{\sqrt{2\pi\sigma^2}}\right)^n \exp\left(-\frac{1}{2\sigma^2} \sum_{i=1}^n (X_i - \theta)^2\right)

Taking the natural log of both sides we get:


l(\theta) = -\frac{n}{2} \ln(2\pi\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^n (X_i - \theta)^2
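As a quick sanity check, the closed form above can be compared against a direct sum of log densities. The sample values and known variance below are made up for illustration only:

```python
import math

# Hypothetical sample and known variance (illustrative values only)
x = [2.1, 1.7, 2.5, 1.9, 2.3]
sigma2 = 1.0
n = len(x)

def log_density(xi, theta):
    # log of the normal density f(x_i | theta, sigma^2)
    return -0.5 * math.log(2 * math.pi * sigma2) - (xi - theta) ** 2 / (2 * sigma2)

def loglik_sum(theta):
    # log-likelihood computed directly as a sum of log densities
    return sum(log_density(xi, theta) for xi in x)

def loglik_formula(theta):
    # the closed form l(theta) derived above
    return -(n / 2) * math.log(2 * math.pi * sigma2) \
           - sum((xi - theta) ** 2 for xi in x) / (2 * sigma2)

# the two expressions agree for any theta
assert abs(loglik_sum(0.5) - loglik_formula(0.5)) < 1e-9
```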

Now if we take the derivative with respect to \theta we get:


l'(\theta) = \frac{1}{\sigma^2} \sum_{i=1}^n (X_i - \theta)

Setting l'(\theta) = 0 gives the unique critical point, and since l''(\theta) = -n/\sigma^2 < 0, this critical point is a maximum. Solving for \theta:


\hat \theta = \bar X
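The conclusion can also be verified numerically: a grid search over \theta should land on the sample mean. The data below are hypothetical, chosen only to illustrate the result:

```python
import math

# Hypothetical sample and known variance (illustrative values only)
x = [2.1, 1.7, 2.5, 1.9, 2.3]
sigma2 = 1.0
n = len(x)

def loglik(theta):
    # closed-form log-likelihood l(theta) from the derivation above
    return -(n / 2) * math.log(2 * math.pi * sigma2) \
           - sum((xi - theta) ** 2 for xi in x) / (2 * sigma2)

# grid search for the maximizer over [0, 5) with step 0.001
grid = [i / 1000 for i in range(0, 5000)]
theta_hat = max(grid, key=loglik)

xbar = sum(x) / n
# the numerical maximizer agrees with the sample mean
assert abs(theta_hat - xbar) < 1e-3
```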

answered by Navin (5.8k points)