Information Entropy
Information entropy is a quantity for measuring the amount of information, proposed by Shannon in 1948. Borrowing the concept of entropy from thermodynamics, Shannon called the average amount of information remaining after redundancy is removed the information entropy, and gave its mathematical expression.
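For a discrete random variable X with possible outcomes x_1, ..., x_n and probabilities p(x_i), that expression (written here with a base-2 logarithm, so the result is in bits) is:

$$
H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)
$$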
Three properties of information entropy
- Monotonicity: The more probable an event is, the less information its occurrence carries. An extreme example is "the sun rises in the east": it is a deterministic event, so observing it carries no information. From the perspective of information theory, the sentence resolves no uncertainty.
- Non-negativity: Information entropy cannot be negative. A negative value would mean that learning some piece of information increases uncertainty, which is illogical.
- Additivity: The total uncertainty of several independent random events occurring together can be expressed as the sum of the uncertainty measures of the individual events, as the sketch after this list illustrates.
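The following is a minimal Python sketch of the definition above; the helper name `entropy` is illustrative, not taken from the source. It evaluates the formula for a few distributions to show monotonicity and additivity numerically.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log(p)) of a discrete distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty.
print(entropy([0.5, 0.5]))           # 1.0

# Monotonicity: a near-certain event carries almost no information.
print(entropy([0.99, 0.01]))         # ~0.081

# Additivity: for two independent variables, the entropy of the joint
# distribution equals the sum of the individual entropies.
coin = [0.5, 0.5]
die = [1 / 6] * 6
joint = [p * q for p in coin for q in die]
print(entropy(joint))                # ~3.585
print(entropy(coin) + entropy(die))  # ~3.585
```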