Final answer:
To calculate information in bits for events with non-equal probabilities, take the negative logarithm of each event's probability, using base 2: I = -log2(p). This follows from the relationship between probability and information in information theory: the less likely an event, the more information its occurrence conveys.
Step-by-step explanation:
To calculate the information (in bits) for an event, information theory gives the formula I = -log2(p), where p is the probability of the event. The less probable an event is, the higher its information content, because more uncertainty is resolved when it occurs; an event that is certain to happen (p = 1) carries zero information. When outcomes have non-equal probabilities, each outcome therefore contributes a different number of bits.
For example, if an event has a probability of 0.5 (like flipping a fair coin and getting heads), the information content is I = -log2(0.5) = 1 bit. This is because there are two equally likely outcomes, and it takes 1 bit to encode a choice between them. By contrast, a less likely outcome with probability 0.25 carries I = -log2(0.25) = 2 bits.
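As a quick illustration, here is a minimal Python sketch of the formula above; the function name information_bits is just a placeholder chosen for this example.

```python
import math

def information_bits(p: float) -> float:
    """Self-information in bits of an event with probability p: I = -log2(p)."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# Fair coin coming up heads: p = 0.5 -> 1 bit
print(information_bits(0.5))   # 1.0

# A less likely event carries more information: p = 0.25 -> 2 bits
print(information_bits(0.25))  # 2.0

# A certain event carries no information: p = 1 -> 0 bits
print(information_bits(1.0))   # 0.0
```

Running this prints 1.0, 2.0, and 0.0, matching the hand calculations: lower-probability events yield more bits, and a certain event yields none.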