Suppose that a standard mass is measured 30 times with the same instrument, and the readings follow a normal distribution with a calculated standard error of the mean a = 0.08 and standard deviation s = 0.43. If the instrument is then used to measure an unknown mass and the reading is 105.6 kg, what is the error of the reading at 95% confidence?


1 Answer


Final answer:

The error of the reading at 95% confidence is approximately ±0.16 kg (1.96 × 0.08 = 0.1568).

Step-by-step explanation:

To calculate the error of the reading at 95% confidence, we can use the formula:
Error = z * (standard error of mean)

Here, the standard error of the mean is given as 0.08. To find z, we look up the corresponding value in the standard normal distribution table: for a two-sided 95% confidence level, z ≈ 1.96.

Substituting the values into the formula gives: Error = 1.96 × 0.08 = 0.1568 ≈ 0.16 kg

Therefore, the error of the reading at 95% confidence is approximately ±0.16 kg, and the measurement can be reported as 105.6 ± 0.16 kg.
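
For anyone who wants to check the arithmetic, here is a minimal Python sketch (it assumes SciPy is available; the variable names are only illustrative, not from the original post):

    from scipy.stats import norm

    sem = 0.08                  # standard error of the mean (kg), as given
    z = norm.ppf(0.975)         # two-sided 95% confidence level -> z ~ 1.96
    error = z * sem             # error bound on the reading
    print(f"z = {z:.2f}, error = +/-{error:.2f} kg")  # prints about +/-0.16 kg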

