20 votes
PLEASE HELP! I WILL GIVE 50 POINTS!!! A radar gun measured the speed of a baseball at 92 miles per hour. If the baseball was actually going 90.3 miles per hour, what was the percent error in this measurement?

by Dennis Anderson (2.9k points)

2 Answers

5 votes

≈ 1.88% error

Percent error compares the measured value to the actual value:

\[
\text{percent error} = \left|\frac{\text{measured} - \text{actual}}{\text{actual}}\right| \times 100\%
= \left|\frac{92 - 90.3}{90.3}\right| \times 100\%
= \frac{1.7}{90.3} \times 100\%
\approx 0.0188 \times 100\%
= 1.88\%
\]
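The same check can be scripted. Below is a minimal Python sketch of the formula worked above; the helper name `percent_error` is our own, introduced only for illustration and not part of the original answer:

```python
def percent_error(measured: float, actual: float) -> float:
    """Return the percent error of a measurement against the true value."""
    return abs(measured - actual) / abs(actual) * 100

# Radar-gun reading of 92 mph versus the actual speed of 90.3 mph:
print(round(percent_error(92, 90.3), 2))  # 1.88
```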

by Vishnu Kanwar (2.8k points)
11 votes

Answer:

≈ 0.19% error

Explanation:

A radar gun measured the speed of a baseball at 103 miles per hour. If the baseball was actually going 102.8 miles per hour, then by the same formula the percent error is

\[
\left|\frac{103 - 102.8}{102.8}\right| \times 100\%
= \frac{0.2}{102.8} \times 100\%
\approx 0.19\%
\]
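Reusing the hypothetical `percent_error` helper sketched in the first answer confirms this variant the same way:

```python
print(round(percent_error(103, 102.8), 2))  # 0.19
```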

by Susaj S N (3.0k points)