ASCII is a 7-bit code, while Unicode is a 16-bit code.
A. True
B. False

1 Answer

Final answer:

B. False. ASCII is correctly described as a 7-bit code, but it is inaccurate to say that Unicode is strictly a 16-bit code. Unicode has several encoding forms, including UTF-16, which is variable-width and can use more than one 16-bit unit to represent additional characters.

Step-by-step explanation:

The statement that ASCII is a 7-bit code is true. The American Standard Code for Information Interchange (ASCII) uses 7 bits, allowing 128 unique values that represent letters, digits, punctuation marks, and control characters.

The statement that Unicode is a 16-bit code is false. Unicode is a character encoding standard with several encoding forms, including UTF-8, UTF-16, and UTF-32. UTF-16 does use 16-bit code units, but it is not limited to a single unit per character: characters outside the Basic Multilingual Plane are encoded as a 'surrogate pair' of two units, which lets UTF-16 reach well beyond the 65,536 values a single 16-bit unit can hold. Unicode therefore covers a much wider range of characters from many languages and symbol sets, and it is not restricted to 16 bits. Because the second half of the statement is wrong, the statement as a whole is false.
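As a small illustration (a sketch only, assuming a standard Python 3 interpreter and no third-party libraries), the snippet below shows that ASCII characters stay within the 7-bit range, while a character outside the Basic Multilingual Plane needs a surrogate pair, i.e. four bytes, in UTF-16:

```python
# ASCII: every character fits in 7 bits (code points 0-127) and one byte.
for ch in "A7!":
    assert ord(ch) < 128                      # within the 7-bit range
    print(ch, ord(ch), len(ch.encode("ascii")), "byte")

# UTF-16: BMP characters fit in one 16-bit unit (2 bytes), but a character
# beyond U+FFFF is encoded as a surrogate pair of two units (4 bytes).
for ch in ("é", "€", "😀"):
    utf16 = ch.encode("utf-16-be")            # big-endian, no byte-order mark
    print(f"U+{ord(ch):04X}", ch, len(utf16), "bytes in UTF-16")

# Expected output (approximately):
#   A 65 1 byte
#   7 55 1 byte
#   ! 33 1 byte
#   U+00E9 é 2 bytes in UTF-16
#   U+20AC € 2 bytes in UTF-16
#   U+1F600 😀 4 bytes in UTF-16   <- surrogate pair, more than 16 bits
```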
