A source emits three symbols with probabilities p1 = 0.2, p2 = 0.7, and p3 = 0.1. Calculate the source entropy.

asked by Feryt (7.5k points)

1 Answer


Final answer:

To find the source entropy for three symbols with probabilities 0.2, 0.7, and 0.1, use the entropy formula H = - ∑ p(i) × log2(p(i)). The calculated source entropy is approximately 1.157 bits.

Step-by-step explanation:

To calculate the source entropy when a source emits three symbols with probabilities p1 = 0.2, p2 = 0.7, and p3 = 0.1, we use the formula for the entropy of a source:

H = - ∑ p(i) × log2(p(i))

where p(i) is the probability of each symbol emitted by the source and the summation (∑) runs over all symbols.

Applying this formula:

H = - (p1 × log2(p1) + p2 × log2(p2) + p3 × log2(p3))

H = - (0.2 × log2(0.2) + 0.7 × log2(0.7) + 0.1 × log2(0.1))

Evaluating each logarithm (log2(0.2) ≈ -2.32193, log2(0.7) ≈ -0.51457, log2(0.1) ≈ -3.32193), we get:

H = - (0.2 × (-2.32193) + 0.7 × (-0.51457) + 0.1 × (-3.32193))

H = - (-0.46439 - 0.36020 - 0.33219)

H ≈ 1.15678 bits

Therefore, the entropy of the source is approximately 1.157 bits.
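If you want to verify the arithmetic, here is a minimal Python sketch of the same calculation; the names `probabilities` and `entropy` are illustrative and not part of the original answer.

```python
import math

# Symbol probabilities from the question
probabilities = [0.2, 0.7, 0.1]

# Source entropy: H = -sum(p * log2(p)) over all symbols
entropy = -sum(p * math.log2(p) for p in probabilities)

print(f"Source entropy: {entropy:.5f} bits")  # prints approximately 1.15678
```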

answered by Ollins (7.9k points)