Final answer:
C. A byte, consisting of 8 bits, is the standard unit for the size of a character in computing, different from a nibble (4 bits) or a word, which varies in size based on the CPU architecture.
Step-by-step explanation:
When discussing the size of a character in computing, the correct term is byte. A byte is composed of 8 bits and serves as the standard unit for representing an alphanumeric character or symbol within a computer system. A nibble is only 4 bits, and a bit, the most basic unit of data in computing, represents a single binary value of 0 or 1; neither has enough capacity to encode a full character (a nibble offers only 16 combinations, fewer than the 26 letters and 10 digits alone). A word, by contrast, refers to the amount of data a machine's CPU is designed to handle efficiently, and its size varies by architecture: common word sizes are 16, 32, or 64 bits. When referring specifically to the size of a character, the byte is the standard unit of measure.
In short: a byte (8 bits) can represent a single alphanumeric character, symbol, or control code, whereas a nibble is only 4 bits and a word is architecture-dependent, typically 16, 32, or 64 bits. That is why the byte is the conventional unit for the size of a character.
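As a quick sanity check, a short Python sketch (assuming an ASCII-compatible encoding such as UTF-8, where each ASCII character occupies exactly one byte) confirms the one-character-per-byte relationship:

```python
# Each ASCII character encodes to exactly one byte (8 bits) in UTF-8.
for ch in "A7$":
    encoded = ch.encode("utf-8")          # bytes object for this character
    bits = format(encoded[0], "08b")      # the byte shown as 8 binary digits
    print(ch, "->", len(encoded), "byte:", bits)

# 'A' is code point 65, which fits in a single 8-bit byte: 01000001
assert len("A".encode("utf-8")) == 1
assert format(ord("A"), "08b") == "01000001"
```

Note that characters outside the ASCII range (for example, accented letters or emoji) encode to more than one byte in UTF-8, which is why the "one byte per character" rule is stated for basic alphanumeric characters and symbols.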