A source emits three symbols with probabilities p1 = 0.2, p2 = 0.7, and p3 = 0.1. Calculate the source entropy.

1 Answer


Final answer:

The source entropy is found by applying the formula H = - (p1 * log2(p1) + p2 * log2(p2) + p3 * log2(p3)) to the given probabilities, which gives H ≈ 1.157 bits per symbol.

Step-by-step explanation:

Source entropy is a concept from information theory, a branch of mathematics and computer science, and is typically covered in college-level courses on information theory or data communication.

Entropy, in the context of information theory, is a measure of the uncertainty or unpredictability of a source's output. The formula for calculating entropy (H) of a source that emits a set of symbols with given probabilities (p1, p2, p3, ...) is given by:

  • H = - (p1 * log2(p1) + p2 * log2(p2) + p3 * log2(p3) + ...)

In this case, the probabilities are p1 = 0.2, p2 = 0.7, and p3 = 0.1. Applying the formula, we get:

H = - (0.2 * log2(0.2) + 0.7 * log2(0.7) + 0.1 * log2(0.1))

Calculating the individual terms: 0.2 * log2(0.2) ≈ -0.4644, 0.7 * log2(0.7) ≈ -0.3602, and 0.1 * log2(0.1) ≈ -0.3322. Summing these and negating the result gives the total source entropy H ≈ 1.157 bits per symbol.
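As a quick check, here is a minimal Python sketch that evaluates the same formula; the helper name source_entropy is illustrative and not from the original answer:

```python
import math

def source_entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p)),
    # skipping zero-probability symbols (0 * log2(0) is taken as 0 by convention).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Probabilities from the question: p1 = 0.2, p2 = 0.7, p3 = 0.1
print(source_entropy([0.2, 0.7, 0.1]))  # prints approximately 1.1568
```

Running this confirms the hand calculation above, H ≈ 1.157 bits per symbol.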
