A sample of 100 independent random numbers is taken from this distribution, and its average is used to estimate the mean of the distribution. What is the standard error of this estimate?

Answer:

The standard error in estimating the mean = (0.1 × standard deviation of the distribution)

Explanation:

The standard error of the sample mean, σₓ, is related to the standard deviation of the distribution, σ, through the relation

σₓ = σ/(√n)

n = sample size = 100

σₓ = σ/(√100)

σₓ = (σ/10) = 0.1σ

Hence, the standard error in estimating the mean = (0.1 × standard deviation of the distribution)
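For a quick sanity check, here is a minimal Python simulation sketch (assuming a standard normal distribution with σ = 1, purely for illustration, since the question does not specify the distribution). The empirical spread of many sample means of size 100 should come out close to σ/√100 = 0.1σ:

```python
import numpy as np

# Sketch only: the question's distribution is unspecified, so we assume
# a standard normal (sigma = 1) for illustration.
rng = np.random.default_rng(42)

sigma = 1.0        # standard deviation of the assumed distribution
n = 100            # sample size
trials = 100_000   # number of repeated samples

# Draw many samples of size n and compute each sample's mean.
sample_means = rng.normal(loc=0.0, scale=sigma, size=(trials, n)).mean(axis=1)

# The spread of the sample means should match sigma / sqrt(n) = 0.1 * sigma.
print("empirical SE: ", sample_means.std(ddof=1))   # ~0.1
print("theoretical SE:", sigma / np.sqrt(n))         # 0.1
```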
