Answer: C. Using a fixed but large number of bits, for example 128, eliminates the possibility of overflow errors.
Step-by-step explanation:
A computer stores every value, including numbers, as a sequence of bits. When an integer is stored using a fixed number of bits n, only values from 0 to 2^n − 1 (for unsigned integers) can be represented; if a calculation produces a result outside that range, an overflow error occurs. Using a large fixed width such as 128 bits makes the representable range enormous (2^128 is roughly 3.4 × 10^38), so in practice ordinary calculations stay well within the limit and overflow is avoided.
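The idea can be demonstrated with a small sketch. The helper `add_fixed_width` below is hypothetical (not from the original answer); it simulates fixed-width unsigned arithmetic by masking a sum to n bits, showing how a small width overflows while a 128-bit width easily holds the same result.

```python
def add_fixed_width(a, b, bits):
    """Add a and b, keeping only the low `bits` bits.

    Any bits above the width are discarded, which is how
    fixed-width hardware arithmetic wraps around on overflow.
    """
    mask = (1 << bits) - 1
    return (a + b) & mask

# With only 8 bits, 200 + 100 = 300 does not fit (max is 255),
# so the result wraps around:
print(add_fixed_width(200, 100, 8))    # 44, not 300

# With 128 bits, the same sum fits with room to spare:
print(add_fixed_width(200, 100, 128))  # 300
```

The largest unsigned value 128 bits can hold is 2^128 − 1, about 3.4 × 10^38, which is why a width that large avoids overflow for everyday computations.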