A DMS X has four symbols x1, x2, x3 and x4 with probabilities P(x1) = 0.4, P(x2) = 0.3, P(x3) = 0.2, P(x4) = 0.1.

4.2.1 Calculate H(x).
4.2.2 Find the information contained in the messages.
(i) x1 x2 x3 x4
(ii) x4 x3 x3 x2


Answer:

4.2.1 To calculate H(x), we need to use the formula for entropy:

H(x) = -∑ P(xi) * log2(P(xi))

Given the probabilities for the symbols x1, x2, x3, and x4 as follows:

P(x1) = 0.4

P(x2) = 0.3

P(x3) = 0.2

P(x4) = 0.1

Let's calculate H(x):

H(x) = - (P(x1) * log2(P(x1)) + P(x2) * log2(P(x2)) + P(x3) * log2(P(x3)) + P(x4) * log2(P(x4)))

H(x) = - (0.4 * log2(0.4) + 0.3 * log2(0.3) + 0.2 * log2(0.2) + 0.1 * log2(0.1))

Calculating each term:

0.4 * log2(0.4) = 0.4 * (-1.3219) = -0.5288

0.3 * log2(0.3) = 0.3 * (-1.7370) = -0.5211

0.2 * log2(0.2) = 0.2 * (-2.3219) = -0.4644

0.1 * log2(0.1) = 0.1 * (-3.3219) = -0.3322

H(x) = - (-0.5288 + (-0.5211) + (-0.4644) + (-0.3322))

H(x) = - (-1.8465)

H(x) = 1.8465 bits/symbol

Therefore, H(x) ≈ 1.8465 bits per symbol.
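
As a quick numerical check, here is a short Python sketch (the variable names are my own, not from the problem) that evaluates the same entropy sum:

```python
import math

# Symbol probabilities for the DMS X, from the problem statement
probs = [0.4, 0.3, 0.2, 0.1]

# Entropy of a discrete source: H(X) = -sum of p * log2(p) over all symbols
H = -sum(p * math.log2(p) for p in probs)

print(f"H(X) = {H:.4f} bits/symbol")
# -> H(X) = 1.8464 bits/symbol; the hand sum above gives 1.8465
#    because each term was rounded to four decimals before adding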

4.2.2 To find the information contained in the messages, we use the self-information of each symbol:

I(xi) = -log2(P(xi))

Because the source is memoryless, the symbols in a message are independent, so the information in a message is the sum of the per-symbol values.

(i) For the message x1 x2 x3 x4:

I(x1) = -log2(P(x1)) = -log2(0.4) = 1.3219

I(x2) = -log2(P(x2)) = -log2(0.3) = 1.7370

I(x3) = -log2(P(x3)) = -log2(0.2) = 2.3219

I(x4) = -log2(P(x4)) = -log2(0.1) = 3.3219

Therefore, the information contained in the message x1 x2 x3 x4 is:

I(x1 x2 x3 x4) = I(x1) + I(x2) + I(x3) + I(x4) = 1.3219 + 1.7370 + 2.3219 + 3.3219 = 8.7027 bits

(ii) For the message x4 x3 x3 x2:

I(x4) = -log2(P(x4)) = -log2(0.1) = 3.3219

I(x3) = -log2(P(x3)) = -log2(0.2) = 2.3219 (x3 appears twice in the message)

I(x2) = -log2(P(x2)) = -log2(0.3) = 1.7370

Therefore, the information contained in the message x4 x3 x3 x2 is:

I(x4 x3 x3 x2) = I(x4) + I(x3) + I(x3) + I(x2) = 3.3219 + 2.3219 + 2.3219 + 1.7370 = 9.7027 bits
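
Both message totals are easy to verify in code. This sketch (again with my own naming, not from the problem) sums -log2(P(xi)) over each message:

```python
import math

# Symbol probabilities from the problem statement
P = {"x1": 0.4, "x2": 0.3, "x3": 0.2, "x4": 0.1}

def message_information(symbols):
    # For a memoryless source, the information in a message is the
    # sum of the self-information -log2(P(s)) of its symbols.
    return sum(-math.log2(P[s]) for s in symbols)

print(f"{message_information(['x1', 'x2', 'x3', 'x4']):.4f} bits")  # -> 8.7027 bits
print(f"{message_information(['x4', 'x3', 'x3', 'x2']):.4f} bits")  # -> 9.7027 bits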
