Final answer:
The slowest nonzero speed the speedometer could read is 0.090 mi/h, obtained by dividing the highest reported speed of 90.0 mi/h by the range factor of 10³.
Step-by-step explanation:
The question asks for the slowest nonzero speed a speedometer can display if it measures speeds over a range of 10³ and the highest speed it reports is 90.0 mi/h. Since the range factor is the ratio of the highest to the lowest measurable speed, we find the lowest speed by dividing the highest speed by the range factor:
Slowest Speed = Highest Speed / Range Factor
Slowest Speed = 90.0 mi/h / 10³
Slowest Speed = 0.090 mi/h
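The arithmetic above can be checked with a short sketch (variable names are illustrative, not from the problem statement):

```python
# Slowest readable speed = highest speed / range factor.
highest_speed = 90.0    # mi/h, fastest speed the speedometer reports
range_factor = 10**3    # speedometer spans a factor-of-1000 range of speeds

slowest_speed = highest_speed / range_factor
print(f"{slowest_speed:.3f} mi/h")  # → 0.090 mi/h
```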
Therefore, the slowest nonzero speed the speedometer could read is 0.090 mi/h, which corresponds to option a.