Consider a source which produces an IID sequence of symbols from the alphabet {A, B, C} with probabilities {0.5, 0.25, 0.25} respectively. (a) What is the entropy of the source? (b) Find binary Huffman codes taking one source symbol at a time. What is the average number of binary code symbols per source symbol? (c) Find binary Huffman codes taking two source symbols at a time, i.e., {AA, AB, AC, BA, BB, BC, CA, CB, CC}. What is the average number of binary code symbols per source symbol?

1 Answer


Final answer:

The entropy of the source is 1.5 bits per symbol. A binary Huffman code over single symbols achieves exactly 1.5 binary code symbols per source symbol, and a Huffman code over pairs of symbols averages 3 binary code symbols per pair, i.e., also 1.5 per source symbol. The codes achieve the entropy exactly because all the probabilities are powers of 1/2.

Step-by-step explanation:

(a) The entropy of a source can be calculated using the formula:

H(X) = -Σ P(x) log2 P(x)

where P(x) is the probability of each symbol and log2 is the base-2 logarithm. Using the given probabilities:

H(X) = -(0.5 log2 0.5 + 0.25 log2 0.25 + 0.25 log2 0.25) = 0.5(1) + 0.25(2) + 0.25(2) = 1.5 bits per symbol.
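A minimal sketch of this computation in Python, assuming nothing beyond the standard library (the name probs is just an illustrative variable):

```python
import math

probs = {"A": 0.5, "B": 0.25, "C": 0.25}

# H(X) = -sum of P(x) * log2 P(x) over the alphabet
entropy = -sum(p * math.log2(p) for p in probs.values())
print(entropy)  # 1.5 bits per source symbol
```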

(b) To find a binary Huffman code for one source symbol at a time, repeatedly merge the two least probable nodes until one node remains, then read the codewords off the resulting binary tree; symbols with higher probabilities get shorter codes. Here one optimal code is A = 0, B = 10, C = 11. The average number of binary code symbols per source symbol is the probability-weighted sum of the codeword lengths: 0.5(1) + 0.25(2) + 0.25(2) = 1.5, which equals the entropy because every probability is a power of 1/2. A sketch of the construction follows below.
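One way to sketch the Huffman construction in Python, using only the standard-library heapq and itertools modules; huffman_codes is a hypothetical helper name introduced here for illustration, not part of any library:

```python
import heapq
import itertools

def huffman_codes(probs):
    """Build a binary Huffman code for a {symbol: probability} dict."""
    counter = itertools.count()  # tie-breaker so heap tuples never compare dicts
    heap = [(p, next(counter), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Pop the two least probable nodes and merge them.
        p1, _, codes1 = heapq.heappop(heap)
        p2, _, codes2 = heapq.heappop(heap)
        # Prefix 0 onto one subtree's codewords and 1 onto the other's.
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]

probs = {"A": 0.5, "B": 0.25, "C": 0.25}
codes = huffman_codes(probs)
avg = sum(probs[s] * len(codes[s]) for s in probs)
print(codes)  # e.g. {'A': '0', 'B': '10', 'C': '11'} (0/1 labels may differ)
print(avg)    # 1.5 binary code symbols per source symbol
```

Any assignment of 0/1 at each merge is equally optimal, so the exact codewords can differ from those above while the lengths, and hence the average, stay the same.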

(c) For two source symbols at a time, form the nine pairs and multiply the probabilities (the source is IID): P(AA) = 0.25; P(AB) = P(AC) = P(BA) = P(CA) = 0.125; P(BB) = P(BC) = P(CB) = P(CC) = 0.0625. Running the same Huffman construction on these pairs gives codeword lengths of 2 for AA, 3 for each pair with probability 0.125, and 4 for each pair with probability 0.0625, so the average is 0.25(2) + 4(0.125)(3) + 4(0.0625)(4) = 3 binary code symbols per pair, i.e., 1.5 per source symbol. Again this matches the entropy, since the pair probabilities are also powers of 1/2.
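Reusing the hypothetical huffman_codes helper and probs dict from the sketch in part (b), the pair computation looks like this:

```python
from itertools import product

# Pair probabilities multiply because the source symbols are IID.
pair_probs = {a + b: probs[a] * probs[b] for a, b in product(probs, repeat=2)}
pair_codes = huffman_codes(pair_probs)
avg_pair = sum(pair_probs[s] * len(pair_codes[s]) for s in pair_probs)
print(avg_pair)      # 3.0 binary code symbols per pair
print(avg_pair / 2)  # 1.5 binary code symbols per source symbol
```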
