A researcher wants to estimate the mean number of miles on 5-year-old Chevy Cavaliers. How many cars should be in a sample to estimate the mean number of miles within 400 miles with 95% confidence? Assume the population standard deviation is 1000 miles. (Hint: You will need to use the margin of error part of the confidence interval.)


1 Answer


To calculate the sample size needed to estimate the mean number of miles within a given margin of error, you need the z-score for the desired confidence level, the population standard deviation, and the desired margin of error. Here are the steps to calculate the sample size:

1. The confidence level is 95%. For a 95% confidence interval, the z-score (found in a Z-table or computed from the standard normal distribution) is approximately 1.96.

2. The population standard deviation, σ (sigma), is given as 1000 miles.

3. The desired margin of error, E, is given as 400 miles.

4. Use the sample size formula n = (Z * σ / E)^2, where 'n' is the sample size, 'Z' is the z-score for the confidence level, 'σ' is the population standard deviation, and 'E' is the margin of error.

Substituting the values into the formula gives:
n = (1.96 * 1000 / 400)^2 = 4.9^2 = 24.01
Since we cannot sample a fraction of a car, we round up to the nearest whole number, so the required sample size is 25.

Therefore, the researcher would need a sample of 25 cars to be 95% confident that the mean number of miles is estimated within a margin of error of 400 miles.
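For reference, here is a small Python sketch of the same calculation. It is not part of the original answer; the helper name required_sample_size is my own, and the z-score is computed from the standard normal distribution rather than read from a table, which is why the intermediate value is 24.009 instead of exactly 24.01.

```python
import math
from statistics import NormalDist

def required_sample_size(confidence, sigma, margin_of_error):
    """Sample size needed to estimate a mean within a given margin of error.

    Uses n = (Z * sigma / E)^2, rounded up to the next whole unit.
    """
    # Two-sided critical value for the requested confidence level,
    # e.g. approximately 1.96 for 95% confidence.
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return math.ceil((z * sigma / margin_of_error) ** 2)

# Values from the problem: sigma = 1000 miles, E = 400 miles, 95% confidence.
print(required_sample_size(0.95, 1000, 400))  # -> 25
```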
