QuantumInfo.ClassicalInfo.Entropy
14 declarations
One-event Shannon entropy
#H₁The one-event entropy function maps a probability to its Shannon entropy, defined as \( H_1(p) = -p \log p \). Here, \( p \) is an element of the unit interval \( [0, 1] \), and by convention (inherited from the underlying `negMulLog` function), \( H_1(0) = 0 \).
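Since the library itself is Lean, a small numeric illustration may help; the Python helper `h1` below is a hypothetical stand-in for `H₁`, including the \( H_1(0) = 0 \) convention:

```python
from math import log

def h1(p: float) -> float:
    """One-event Shannon entropy -p * log(p), with h1(0) = 0 by convention."""
    if p == 0:
        return 0.0  # convention inherited from negMulLog
    return -p * log(p)

print(h1(0.0))        # 0.0 by convention
print(abs(h1(1.0)))   # 0.0, since log(1) = 0
print(h1(0.5))        # 0.5 * log(2) ≈ 0.3466
```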
For the Shannon entropy function defined by \( H_1(p) = -p \log p \), the value of the function at \( p = 0 \) is equal to \( 0 \).
\( H_1(1) = 0 \)
#H₁_one_eq_zeroThe Shannon entropy function \( H_1 \) evaluated at the probability \( p = 1 \) is equal to \( 0 \).
\( H_1(p) \ge 0 \) for \( p \in [0, 1] \)
#H₁_nonnegLet \( p \) be a probability. The Shannon entropy of \( p \), denoted by \( H_1(p) \), is non-negative, i.e., \( H_1(p) \ge 0 \).
\( H_1(p) < 1 \) for all \( p \in [0, 1] \)
#H₁_le_1For any probability \( p \) in the unit interval \( [0, 1] \), the Shannon entropy \( H_1(p) \) is strictly less than \( 1 \).
For any probability \( p \), the Shannon entropy satisfies the inequality \( H_1(p) \le 1/e \), where \( e \) is Euler's number.
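The three bounds above (non-negativity, a value below \( 1 \), and a sharper bound involving Euler's number) can be spot-checked numerically; `h1` is a hypothetical helper mirroring `H₁`, not part of the library:

```python
from math import log, e

def h1(p: float) -> float:
    return 0.0 if p == 0 else -p * log(p)

samples = [i / 1000 for i in range(1001)]        # grid over [0, 1]
values = [h1(p) for p in samples]
assert all(v >= 0 for v in values)               # non-negativity
assert all(v <= 1 / e + 1e-12 for v in values)   # H₁(p) ≤ 1/e, hence < 1
assert abs(h1(1 / e) - 1 / e) < 1e-12            # the bound is attained at p = 1/e
print("max on grid:", max(values))
```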
Concavity of the Shannon entropy
#H₁_concaveFor any probabilities \( p, q \in [0, 1] \) and any weight \( \lambda \in [0, 1] \), the Shannon entropy function is concave, satisfying the inequality: \( \lambda H_1(p) + (1 - \lambda) H_1(q) \le H_1(\lambda p + (1 - \lambda) q) \), where \( \lambda p + (1 - \lambda) q \) denotes the convex combination of \( p \) and \( q \).
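A randomized spot check of the concavity inequality, again with a hypothetical `h1` standing in for `H₁`:

```python
import random
from math import log

def h1(p: float) -> float:
    return 0.0 if p == 0 else -p * log(p)

random.seed(0)
for _ in range(1000):
    p, q, lam = random.random(), random.random(), random.random()
    mix = lam * p + (1 - lam) * q
    # concavity: the entropy of the mixture dominates the mixture of entropies
    assert h1(mix) >= lam * h1(p) + (1 - lam) * h1(q) - 1e-12
print("concavity verified on 1000 random triples")
```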
Shannon entropy of a discrete distribution
#HₛLet \( \iota \) be a finite set and \( d \) be a probability distribution on \( \iota \). The Shannon entropy of the distribution is defined as the sum over all elements \( i \in \iota \) of the individual entropy contributions \( H_1(d_i) \), where \( d_i \) is the probability assigned to \( i \) by \( d \). Mathematically, this is expressed as: \( H_s(d) = \sum_{i \in \iota} H_1(d_i) \), where \( H_1(p) = -p \log p \) for \( p \in [0, 1] \).
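As a numeric sketch of this definition, with `hs` as a hypothetical Python counterpart of `Hₛ` over a distribution given as a list of probabilities:

```python
from math import log

def h1(p: float) -> float:
    return 0.0 if p == 0 else -p * log(p)

def hs(d: list[float]) -> float:
    """Shannon entropy of a finite distribution: sum of one-event entropies."""
    assert abs(sum(d) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(h1(p) for p in d)

print(hs([0.5, 0.25, 0.25]))   # = 1.5 * log(2) ≈ 1.0397
```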
The Shannon entropy is non-negative
#Hₛ_nonnegLet \( \iota \) be a finite set and let \( d \) be a probability distribution on \( \iota \). The Shannon entropy of the distribution, denoted by \( H_s(d) \), is non-negative, i.e., \( H_s(d) \ge 0 \).
Let \( \iota \) be a finite set and \( d \) be a probability distribution on \( \iota \). The Shannon entropy of the distribution, denoted \( H_s(d) \), is less than or equal to the natural logarithm of the cardinality of \( \iota \), i.e., \( H_s(d) \le \log |\iota| \).
The Shannon entropy of a constant distribution is \( 0 \)
#Hₛ_constant_eq_zeroLet \( \iota \) be a type and \( i \) be an element of \( \iota \). Let \( d \) be the constant probability distribution centered at \( i \) (where \( d_j = 1 \) if \( j = i \) and \( d_j = 0 \) otherwise). Then the Shannon entropy of this distribution is zero, i.e., \( H_s(d) = 0 \).
Let \( \iota \) be a non-empty finite set. For the uniform probability distribution on \( \iota \), the Shannon entropy is equal to the natural logarithm of the cardinality of \( \iota \), denoted \( \log |\iota| \).
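Both extreme cases, a point mass giving entropy \( 0 \) and the uniform distribution attaining the upper bound \( \log |\iota| \), can be illustrated numerically; `hs` is a hypothetical helper, not the Lean `Hₛ`:

```python
from math import log

def h1(p: float) -> float:
    return 0.0 if p == 0 else -p * log(p)

def hs(d: list[float]) -> float:
    return sum(h1(p) for p in d)

n = 8
point_mass = [1.0] + [0.0] * (n - 1)   # constant distribution: entropy 0
uniform = [1.0 / n] * n                # uniform distribution: entropy log n
assert hs(point_mass) == 0.0
assert abs(hs(uniform) - log(n)) < 1e-12   # attains the upper bound log |ι|
print("point mass:", hs(point_mass), "uniform:", hs(uniform))
```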
Let \( p \) be a probability. The Shannon entropy of a coin distribution (a two-event distribution) with success probability \( p \), denoted by \( H_s(\mathrm{coin}\ p) \), is equal to the binary entropy of \( p \), denoted by \( H_b(p) \) (or `Real.binEntropy p`).
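Numerically, the entropy of the two-event distribution \( (p, 1-p) \) agrees with the binary entropy formula \( H_b(p) = -p \log p - (1-p) \log(1-p) \); both helpers below are hypothetical illustrations, using the natural logarithm as `Real.binEntropy` does:

```python
from math import log

def h1(p: float) -> float:
    return 0.0 if p == 0 else -p * log(p)

def bin_entropy(p: float) -> float:
    """Binary entropy in natural log, cf. Mathlib's Real.binEntropy."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log(p) - (1 - p) * log(1 - p)

for p in [0.0, 0.1, 0.5, 0.9, 1.0]:
    coin = [p, 1 - p]                   # two-event "coin" distribution
    assert abs(sum(h1(q) for q in coin) - bin_entropy(p)) < 1e-12
print("Hₛ(coin p) = H_b(p) on the sample points")
```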
Shannon entropy is invariant under reordering of probabilities
#Hₛ_eq_of_multiset_map_eqLet \( d_1 \) and \( d_2 \) be probability distributions on finite sets \( \iota_1 \) and \( \iota_2 \), respectively. If the multiset of probabilities occurring in \( d_1 \) is equal to the multiset of probabilities occurring in \( d_2 \), then the Shannon entropy of \( d_1 \) is equal to the Shannon entropy of \( d_2 \), i.e., \( H_s(d_1) = H_s(d_2) \).
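Since the entropy is a sum over the multiset of probabilities, reordering (or relabeling the index set) cannot change it; a quick check with a hypothetical `hs` helper:

```python
from math import log

def h1(p: float) -> float:
    return 0.0 if p == 0 else -p * log(p)

def hs(d: list[float]) -> float:
    return sum(h1(p) for p in d)

d1 = [0.1, 0.2, 0.3, 0.4]
d2 = [0.4, 0.1, 0.3, 0.2]   # same multiset of probabilities, different order
assert abs(hs(d1) - hs(d2)) < 1e-12
print("entropies agree:", hs(d1))
```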
