PLEASE HELP! I WILL GIVE 50 POINTS!!! A radar gun measured the speed of a baseball at 92 miles per hour. If the baseball was actually going 90.3 miles per hour, what was the percent error in this measurement?

by User Sachi (4.6k points)

2 Answers

6 votes

1.883% error


$$\text{percent error} = \left|\frac{\text{actual} - \text{measured}}{\text{actual}}\right| \times 100\% = \left|\frac{90.3 - 92}{90.3}\right| \times 100\% = \frac{1.7}{90.3} \times 100\% \approx 1.883\%$$
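If you want to check the arithmetic with code, here is a minimal Python sketch of the same formula. The function name `percent_error` is just an illustrative choice, not anything from the original answer:

```python
def percent_error(actual, measured):
    """Absolute percent error of a measurement relative to the actual value."""
    return abs(actual - measured) / abs(actual) * 100

# Radar gun example from the question: measured 92 mph, actual 90.3 mph.
print(f"{percent_error(90.3, 92):.3f}%")  # prints 1.883%
```

Note that the denominator is the actual (true) value, not the measured one; dividing by the measured value would give a slightly different (and incorrect) result.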

by User Gawbul (4.3k points)
1 vote

Answer:

About 0.19% error (for a variant of this question with different numbers).

Explanation:

This answer addresses a variant: a radar gun measured the speed of a baseball at 103 miles per hour, while the baseball was actually going 102.8 miles per hour. Applying the same formula as above:

$$\left|\frac{102.8 - 103}{102.8}\right| \times 100\% = \frac{0.2}{102.8} \times 100\% \approx 0.19\%$$

by User Saurabh Kumar (4.7k points)