17.3k views
2 votes
Develop a Matlab program that allows you to plot the entropy of a source with variable output probabilities. We wish to observe that the maximum source entropy does indeed occur when the source outputs are equally likely. Start with a simple two-output source [s1; s2] with respective probabilities [a; 1-a] and plot the entropy as a function of the parameter a. Then consider more complex cases such as a three output source [s1; s2; s3] with respective probabilities [a; b; 1 - a - b]. Be creative with the manner in which the results are displayed.

by User Bezelinjah (4.9k points)

2 Answers

4 votes

Final answer:

A Matlab program to plot source entropy for different output probabilities can be constructed using the concept of Shannon entropy, showing that maximum entropy occurs when all outputs have equal probability.

Step-by-step explanation:

To develop a Matlab program that plots the entropy of a source with variable output probabilities and demonstrates that the maximum source entropy occurs when the outputs are equally likely, you can use Shannon entropy. For a simple two-output source with probabilities [a; 1-a], the entropy is H = -a*log2(a) - (1-a)*log2(1-a), which peaks at H = 1 bit when a = 1/2. For a three-output source with probabilities [a; b; 1-a-b], the formula extends to H = -a*log2(a) - b*log2(b) - (1-a-b)*log2(1-a-b), which peaks at log2(3) ≈ 1.585 bits when a = b = 1/3. (By convention, 0*log2(0) is taken as 0.) You can then plot entropy against the probability parameters in Matlab.
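For the two-output case, a minimal sketch looks like this (the variable names `a` and `H` and the sweep resolution are arbitrary choices; the endpoints 0 and 1 are excluded to avoid log2(0)):

```matlab
% Entropy of a binary source as a function of a = P(s1).
a = linspace(0.001, 0.999, 500);              % sweep P(s1), avoiding 0 and 1
H = -a .* log2(a) - (1 - a) .* log2(1 - a);   % Shannon entropy in bits

plot(a, H, 'LineWidth', 1.5);
xlabel('a = P(s_1)');
ylabel('H(a)  [bits]');
title('Entropy of a two-output source');
grid on;
```

The curve is symmetric about a = 1/2 and reaches its maximum of 1 bit there, which is exactly the equally-likely case.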



The Matlab code can make use of the plot function for graphing, and 'for' loops or vector operations to calculate entropy over a range of probability values. Below is a pseudocode framework for such a program:




  • Define the range of probability values for a (and b in the 3-output case) over which you wish to calculate entropy.

  • For each value of a (and b), calculate the entropy using the Shannon formula.

  • Plot the resulting entropy values against a (and b).

  • Label the axes and add a title to the plot for clarity.
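The steps above can be sketched for the three-output source as a surface plot over the valid probability simplex a + b ≤ 1 (the grid step and masking approach here are illustrative choices, not the only way to do it):

```matlab
% Entropy of a three-output source over the simplex a + b < 1.
step = 0.005;
[A, B] = meshgrid(step:step:1-step, step:step:1-step);
C = 1 - A - B;                      % probability of s3

valid = C > 0;                      % keep only points inside the simplex
H = nan(size(A));                   % NaN outside, so surf leaves it blank
H(valid) = -A(valid).*log2(A(valid)) ...
         - B(valid).*log2(B(valid)) ...
         - C(valid).*log2(C(valid));

surf(A, B, H, 'EdgeColor', 'none'); % 3-D entropy surface as the "creative" display
xlabel('a = P(s_1)');
ylabel('b = P(s_2)');
zlabel('H(a,b)  [bits]');
title('Entropy of a three-output source');
colorbar;
```

The surface peaks at a = b = 1/3 (so that all three probabilities equal 1/3), where H = log2(3) ≈ 1.585 bits; `contourf(A, B, H)` would be an equally valid 2-D alternative display.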



This graphical representation will show that the entropy reaches its maximum as the probabilities approach equal values, consistent with the fact that the uniform distribution maximizes Shannon entropy: an equally likely source is the most unpredictable one.

by User Ahmad Samilo (5.3k points)