A radar gun measured the speed of a baseball at 103 miles per hour. If the baseball was actually going 102.8 miles per hour, what was the percent error in this measurement? Round to the nearest hundredth of a percent.


1 Answer


Answer

Percent Error ≈ 0.19%

Step-by-step explanation

Percent error is given as

\text{Percent Error} = \frac{\text{Error}}{\text{True value}} \times 100\%

Error = | (Incorrect value) - (True value) |

Incorrect value = 103 miles per hour

True value = 102.8 miles per hour


Error = | 103 - 102.8 |

Error = 0.2 miles per hour


\begin{gathered} \text{Percent Error} = \frac{\text{Error}}{\text{True value}} \times 100\% \\ \text{Percent Error} = \frac{0.2}{102.8} \times 100\% \approx 0.1946\% \\ \text{Percent Error} \approx 0.19\% \text{ (to the nearest hundredth of a percent)} \end{gathered}
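For a quick sanity check, here is a minimal Python sketch of the same arithmetic (the variable names measured and true_value are just illustrative, not from the problem):

# Percent error = |measured - true| / true * 100
measured = 103.0     # radar gun reading, miles per hour
true_value = 102.8   # actual speed, miles per hour

error = abs(measured - true_value)        # 0.2 mph (up to float rounding)
percent_error = error / true_value * 100  # about 0.1946 percent

print(round(percent_error, 2))            # prints 0.19

Rounding to two decimal places matches the question's "nearest hundredth of a percent" requirement.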

Hope this helps!