4 votes
A radar measures the speed of a car at 45 miles per hour. The actual speed of the car is 40 miles per hour. What is the percent error in the reading of the radar?

by User Gauranga (4.5k points)

2 Answers

4 votes
Answer: 11.11%

Explanation:
The radar reading is off by 45 − 40 = 5 miles per hour, so we need to find what percent 5 is of 45.
Assume that 45 corresponds to 100%, since it is the radar's output, and let x% stand for the value we are looking for, so 100% corresponds to 45 and x% corresponds to 5.
Dividing one relation by the other gives 100%/x% = 45/5.
Taking the reciprocal of both sides, x%/100% = 5/45, which works out to x ≈ 11.11%.
So 5 is about 11.11% of 45, and the radar reading is off by about 11.11%.
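
A minimal Python sketch of the proportion used above, assuming (as this answer does) that the 45 mph reading is taken as the 100% reference:

# Treat the 45 mph radar reading as 100% (this answer's assumption)
# and compute what percent the 5 mph difference is of that reading.
reading = 45.0   # radar reading, mph
actual = 40.0    # actual speed, mph

difference = reading - actual          # 5 mph
percent = difference / reading * 100   # 5 / 45 * 100

print(f"{percent:.2f}%")               # prints 11.11%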
by User Jianweichuah (4.2k points)
4 votes

Answer: 45

Explanation:

by User Tekz (4.5k points)