A radar gun measured the speed of a baseball at 103 miles per hour. If the baseball was actually going 102.8 miles per hour, what was the percent error in this measurement?

2 Answers


Answer:

Approximately 0.195%

Explanation:

Percent error = (103 − 102.8) / 102.8 × 100 ≈ 0.195%. Hope this helps you and have a great day!


Answer:

The percent error in this measurement is approximately 0.195%.

Explanation:

Given: A radar gun measured the speed of a baseball at 103 miles per hour, while the baseball was actually going 102.8 miles per hour.

To find: The percent error in this measurement.

Solution:

The measured speed is 103 miles per hour.

The actual speed is 102.8 miles per hour.

The percent error is given by:

\text{Percent error}=\frac{|\text{Measured}-\text{Actual}|}{\text{Actual}}\times 100

\text{Percent error}=\frac{103-102.8}{102.8}\times 100

\text{Percent error}=\frac{0.2}{102.8}\times 100

\text{Percent error}\approx 0.195\%

Therefore, the percent error in this measurement is approximately 0.195%.
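
As a quick check, here is a minimal Python sketch of the same calculation (the function name percent_error is just for illustration):

    def percent_error(measured: float, actual: float) -> float:
        """Percent error of a measurement relative to the true value."""
        return abs(measured - actual) / abs(actual) * 100

    # Radar gun reading of 103 mph vs. the ball's true speed of 102.8 mph
    print(percent_error(103, 102.8))  # prints 0.19455..., about 0.195%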
