3. Chase got a lab report back with a grade of .75 written in red on it. He had thought his lab grade would be .95.

What was Chase's percent error in estimating his own grade?

asked by Alexander Nied (2.8k points)

1 Answer


Final answer:

Chase's percent error was about 21.05%: take the absolute difference between the grade he received and the grade he expected, divide by the expected grade, and multiply by 100%.

Step-by-step explanation:

To find Chase's percent error in estimating his own grade, we use the percent error formula, treating his expected grade as the reference value:

Percent Error = (|Observed Value − Expected Value| / Expected Value) × 100%

In this case, Chase's observed grade was 0.75 and his expected grade was 0.95. Plugging these values into the formula:

Percent Error = (|0.75 − 0.95| / 0.95) × 100% = (0.20 / 0.95) × 100% ≈ 21.05%

Therefore, Chase's percent error in estimating his own grade is about 21.05%.
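
As a quick sanity check, here is a minimal Python sketch of the same calculation (the percent_error helper is illustrative, not part of the original answer):

def percent_error(observed: float, expected: float) -> float:
    # Percent error of an observed value relative to an expected (reference) value
    return abs(observed - expected) / expected * 100

# Chase received 0.75 but expected 0.95
print(round(percent_error(0.75, 0.95), 2))  # prints 21.05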

answered by Hzdbyte (3.1k points)