A radar gun measured the speed of a baseball at 5 miles per hour. If the baseball was going 4.2 miles per hour, what was the percent error in this measurement?

by Laycat (6.5k points)

1 Answer


Percent error is a measure of how far a measurement is from the true value. It is calculated by taking the absolute difference between the measured value and the true value, dividing that difference by the true value, and then multiplying by 100 to express it as a percentage.

In this case, the true value of the baseball's speed is 4.2 miles per hour and the measured value is 5 miles per hour. To find the percent error:

(5 - 4.2) / 4.2 * 100 = 0.8 / 4.2 * 100 ≈ 0.190476 * 100 ≈ 19.0476%

So the percent error in this measurement is about 19.0476%, meaning the radar gun's reading was off from the true value by roughly that much.

Rounded for ease of interpretation, the percent error is approximately 19%.
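
If you want to double-check the arithmetic, here is a minimal Python sketch; the percent_error helper is just an illustration of the formula described above, not something from the original problem:

```python
def percent_error(measured, true_value):
    """Percent error = |measured - true| / true * 100."""
    return abs(measured - true_value) / true_value * 100

# Values from the problem: measured speed 5 mph, true speed 4.2 mph
error = percent_error(5, 4.2)
print(round(error, 4))  # 19.0476
print(round(error))     # 19
```

Using the absolute difference keeps the result positive whether the reading is above or below the true value.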

by HenryR (7.5k points)