Jace measured a line to be 14 inches long. If the actual length of the line is

13.1 inches, then what was the percent error of the measurement, to the
nearest tenth of a percent?

Asked by User Migajek

1 Answer

Answer:

6.9%

Explanation:

Jace measured 14 inches, but the actual length is 13.1 inches, so the measurement is off by 14 − 13.1 = 0.9 inches. Percent error is the size of that difference divided by the actual value, times 100: 0.9 ÷ 13.1 ≈ 0.0687022901, and multiplying by 100 gives about 6.87022901%. Rounded to the nearest tenth of a percent, as the question asks, the percent error is 6.9%.
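The arithmetic above can be checked with a short script (the variable names are just illustrative):

```python
# Percent error = |measured - actual| / actual * 100
measured = 14.0   # Jace's measurement, in inches
actual = 13.1     # the true length, in inches

percent_error = abs(measured - actual) / actual * 100
print(round(percent_error, 1))  # 6.9
```

Using the absolute value means the formula works whether the measurement overshoots or undershoots the true value.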

Answered by User Wiseguy