A student wants to estimate the mean score of all college students for a particular exam. First use the range rule of thumb to make a rough estimate of the standard deviation of those scores. Possible scores range from 500 to 2200. Use technology and the estimated standard deviation to determine the sample size corresponding to a 95% confidence level and a margin of error of 100 points. What isn't quite right with this exercise?


1 Answer


Answer:

70

Explanation:

It is given that scores range from 500 to 2200,

so range = 2200 − 500 = 1700.

By the range rule of thumb, the standard deviation is roughly

s = range / 4 = 1700 / 4 = 425

Margin of error E = 100

Confidence level = 95% = 0.95

Significance level α = 1 − 0.95 = 0.05

z_{α/2} = z_{0.05/2} = z_{0.025} = 1.96 (from the z table)

Sample size:

n ≥ (z_{α/2} · s / E)²

n ≥ (1.96 × 425 / 100)²

n ≥ 69.3889

Since the sample size must be a whole number, round up to get n = 70.
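The calculation above can be checked with a short script. This is just a sketch using Python's standard library (`statistics.NormalDist` supplies the critical value instead of a z table):

```python
import math
from statistics import NormalDist

# Range rule of thumb: s ≈ (max − min) / 4
s = (2200 - 500) / 4          # 425
E = 100                       # desired margin of error
confidence = 0.95
alpha = 1 - confidence        # 0.05

# Critical value z_{α/2} for a 95% confidence level (≈ 1.96)
z = NormalDist().inv_cdf(1 - alpha / 2)

# Required sample size, rounded up to the next whole number
n = math.ceil((z * s / E) ** 2)
print(n)  # 70
```

Using the more precise z ≈ 1.95996 instead of the rounded 1.96 still gives n = 70 after rounding up.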
