How many bits do I need at a minimum to represent a 7-digit decimal number? Explain your answers!

a. (5%) Using packed decimal?
b. (5%) Using two’s complement?
c. (5%) Using unsigned binary?


1 Answer


Final answer:

A minimum of 32 bits is required to represent a 7-digit decimal number using packed decimal, 25 bits using two's complement, and 24 bits using unsigned binary.

Step-by-step explanation:

When representing a 7-digit decimal number, the number of bits needed depends on the encoding method used:

  • Packed Decimal: This encoding stores one decimal digit per 4-bit nibble, so two digits fit in each byte, and a trailing nibble is reserved for the sign. Seven digits alone would need 7 × 4 = 28 bits, but with the sign nibble that becomes 8 nibbles = 4 bytes, so 32 bits are needed.
  • Two’s Complement: To represent a signed value, the number of bits is one more than the number needed for the maximum unsigned magnitude. The largest 7-digit decimal is 9,999,999, which in binary is 100110001001011001111111, i.e. 24 bits, so two's complement needs 25 bits.
  • Unsigned Binary: For unsigned binary, we only need to cover the maximum value 9,999,999, which fits in the same 24 bits as above without the extra sign bit (see the sketch after this list).

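If you want to sanity-check these counts yourself, here is a small Python sketch (not part of the original answer; the variable names are my own) that derives all three values:

```python
# Largest 7-digit decimal number
max_7_digit = 9_999_999

# Unsigned binary: smallest n such that 2**n > 9,999,999
unsigned_bits = max_7_digit.bit_length()            # 24

# Two's complement: one extra bit for the sign
twos_complement_bits = unsigned_bits + 1            # 25

# Packed decimal: one 4-bit nibble per digit plus a sign nibble,
# padded out to a whole number of bytes
digits = 7
packed_nibbles = digits + 1                         # 7 digit nibbles + sign
packed_bits = ((packed_nibbles * 4 + 7) // 8) * 8   # 32

print(bin(max_7_digit))   # 0b100110001001011001111111  (24 bits)
print(unsigned_bits, twos_complement_bits, packed_bits)  # 24 25 32
```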
To summarize, the minimum number of bits required for each encoding scheme is:

  1. Packed Decimal: 32 bits
  2. Two’s Complement: 25 bits
  3. Unsigned Binary: 24 bits
