Discrete source S₁ has 4 equiprobable symbols, while discrete source S₂ has 16 equiprobable symbols. When the entropies of these two sources are compared, the entropy of:

1. S₁ is greater than S₂
2. S₁ is less than S₂
3. S₁ is equal to S₂
4. Depends on rate of symbols/second

1 Answer


Final answer:

The entropy (average unpredictability) of a source with 4 equiprobable symbols (S₁) is less than the entropy of a source with 16 equiprobable symbols (S₂), because for an equiprobable source the entropy grows with the number of symbols. The correct option is 2: S₁ is less than S₂.

Step-by-step explanation:

The concept in question is entropy, which in information theory measures the unpredictability, or average information content, of a message source. Given that discrete source S₁ has 4 equiprobable symbols and discrete source S₂ has 16 equiprobable symbols, the two entropies can be computed directly and compared to determine which source is more unpredictable.

For a discrete source with M equiprobable symbols, each symbol has probability 1/M, so the Shannon entropy H = −Σ pᵢ log₂ pᵢ reduces to H = log₂ M bits per symbol. (This mirrors Boltzmann's thermodynamic formula S = k log W, with the constant k set to 1 and the logarithm taken base 2.) Thus H(S₁) = log₂ 4 = 2 bits per symbol and H(S₂) = log₂ 16 = 4 bits per symbol. Clearly, the entropy of S₁ is less than the entropy of S₂.
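As a quick sanity check, here is a minimal Python sketch (the function name entropy_equiprobable is just illustrative, not from the question) that evaluates the general Shannon formula and confirms it equals log₂ M for an equiprobable source:

```python
import math

def entropy_equiprobable(num_symbols: int) -> float:
    """Shannon entropy (bits/symbol) of a source with num_symbols equiprobable symbols."""
    p = 1.0 / num_symbols
    # General formula H = -sum(p_i * log2(p_i)); all M terms are equal here,
    # so the sum collapses to log2(num_symbols).
    return -sum(p * math.log2(p) for _ in range(num_symbols))

h_s1 = entropy_equiprobable(4)   # log2(4)  = 2.0 bits/symbol
h_s2 = entropy_equiprobable(16)  # log2(16) = 4.0 bits/symbol
print(f"H(S1) = {h_s1:.1f} bits, H(S2) = {h_s2:.1f} bits")  # H(S1) < H(S2)
```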

by Mehmatrix (8.8k points)