Differential entropy
The differential entropy of a continuous random variable $X$ with density $p(x)$ is
$$h(X) = -\int p(x)\,\ln p(x)\,dx.$$
It is the continuous analogue of Shannon entropy for discrete distributions, $H = -\sum_i p_i \ln p_i$, but it behaves differently in several important ways.
Characteristics:
- Scale dependence:
Shannon entropy is always nonnegative and invariant under relabeling of outcomes. By contrast, $h(X)$ can be negative and depends on the units of $x$. If we rescale $Y = aX$, then $h(Y) = h(X) + \ln|a|$.
- Discretization link:
If a continuous random variable $X$ is discretized into bins of size $\Delta$, then the Shannon entropy of the discretized distribution satisfies
$$H(X_\Delta) \approx h(X) - \ln \Delta$$
(see the numerical check after this list).
Thus, differential entropy is not absolute: it only gains physical meaning when a fundamental resolution scale is specified.
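A minimal numerical check of the discretization link, assuming NumPy and SciPy are available. The standard Gaussian is used because its differential entropy has the closed form $h = \tfrac{1}{2}\ln(2\pi e)$; the bin widths and grid range are arbitrary choices:

```python
# Check H(X_Δ) ≈ h(X) − ln Δ for a standard Gaussian,
# whose differential entropy is h = ½ ln(2πe) ≈ 1.4189 nats.
import numpy as np
from scipy.stats import norm

h_exact = 0.5 * np.log(2 * np.pi * np.e)

for delta in (0.5, 0.1, 0.01):
    edges = np.arange(-10, 10 + delta, delta)   # bins of width Δ
    probs = np.diff(norm.cdf(edges))            # p_i = P(X in bin i)
    probs = probs[probs > 0]
    H = -np.sum(probs * np.log(probs))          # discrete Shannon entropy (nats)
    print(f"Δ={delta:5.2f}:  H={H:.4f},  h − ln Δ = {h_exact - np.log(delta):.4f}")
```

As $\Delta$ shrinks, $H$ grows like $-\ln\Delta$ while $H + \ln\Delta$ stays pinned at $h$.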
Planck Length as a Natural Cutoff
In physics, the Planck length
$$\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35}\ \text{m}$$
is often regarded as the minimal meaningful length scale. When interpreting entropy in continuous systems (fields, spacetime degrees of freedom, black holes), the bin size $\Delta$ can be identified with $\ell_P$.
This gives a bridge between differential entropy and a physically grounded discrete Shannon entropy:
$$H \approx h(X) - \ln \ell_P.$$
- Without such a cutoff, $h(X)$ remains ambiguous.
- With $\ell_P$ as the "ultimate resolution," entropy becomes a dimensionless, absolute quantity (see the estimate below).
- This perspective is important in quantum gravity and black hole thermodynamics, where entropy counts microscopic states per Planck-scale cell of phase space or spacetime.
Gibbs entropy
Consider a system in Classical Statistical Mechanics with ensemble given by a phase-space density $\rho(q, p)$. The statistical definition of entropy, known as the Gibbs entropy, is
$$S = -k_B \int \rho(q, p)\,\ln \rho(q, p)\ dq\,dp.$$
Important caution:
- In the continuous case, the numerical value of $S$ depends on the coordinates used and on the units of $\rho$.
- Unlike the discrete case, $S = 0$ does not imply determinism. For example, a uniform distribution on $[0, 1]$ has $h = 0$ (in natural log units), yet it represents maximum uncertainty given that support.
- A completely deterministic macrostate (Dirac delta distribution) has $S = -\infty$ in this formula, not zero.
In some sense, the Gibbs entropy of a continuous ensemble is just a differential entropy (scaled by $k_B$), so it inherits all of the caveats above; a small worked check follows.
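A tiny illustration of the unit dependence, assuming only NumPy: the uniform density on $[0, L]$ has $h = \ln L$, which is zero, negative, or positive depending on $L$, and shifts by $\ln 100$ if lengths are remeasured in centimetres instead of metres:

```python
# h(Uniform(0, L)) = ln L: the sign and value depend entirely on the unit of L.
import numpy as np

for L in (1.0, 0.5, 2.0):
    print(f"L = {L} m:  h = ln L = {np.log(L):+.4f} nats")

# Same 1 m interval measured in cm: h shifts by ln(100) ≈ 4.6052 nats.
print(f"L = 100 cm:  h = {np.log(100.0):+.4f} nats")
```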
Important relation
Consider a system in thermal equilibrium with a bath at temperature $T$, described by the canonical ensemble
$$\rho = \frac{e^{-\beta H}}{Z}, \qquad Z = \int e^{-\beta H}\,dq\,dp, \qquad \beta = \frac{1}{k_B T}.$$
- Express Entropy using the partition function ($Z$)

Substituting $\ln \rho = -\beta H - \ln Z$ into the entropy definition:
$$S = -k_B \langle \ln \rho \rangle.$$
Taking the average:
$$S = k_B \beta \langle H \rangle + k_B \ln Z.$$
Plugging in $E = \langle H \rangle$:
$$S = \frac{E}{T} + k_B \ln Z.$$
With $F = -k_B T \ln Z$ (Helmholtz free energy):
$$S = \frac{E - F}{T}, \qquad \text{equivalently } F = E - TS.$$
- Take the Differential

For fixed Hamiltonian parameters:
$$d(\ln Z) = \frac{\partial \ln Z}{\partial \beta}\,d\beta = -E\,d\beta.$$
Hence:
$$dF = -k_B \ln Z\,dT - k_B T\,d(\ln Z) = -k_B \ln Z\,dT - \frac{E}{T}\,dT = -S\,dT,$$
using $d\beta = -dT/(k_B T^2)$. Substituting $E = F + TS$:
$$dE = dF + T\,dS + S\,dT = T\,dS.$$
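A quick numerical sanity check of $S = k_B \beta \langle H \rangle + k_B \ln Z$, assuming a toy two-level system with energies $0$ and $\varepsilon$ and working in units where $k_B = 1$; the values $\varepsilon = 1$, $T = 0.7$ are arbitrary:

```python
# Verify S = β⟨H⟩ + ln Z for a two-level system (k_B = 1).
import numpy as np

eps, T = 1.0, 0.7
beta = 1.0 / T
E_levels = np.array([0.0, eps])

Z = np.sum(np.exp(-beta * E_levels))   # partition function
p = np.exp(-beta * E_levels) / Z       # canonical probabilities
E = np.sum(p * E_levels)               # mean energy ⟨H⟩
S_direct  = -np.sum(p * np.log(p))     # Gibbs entropy −Σ p ln p
S_formula = beta * E + np.log(Z)       # β⟨H⟩ + ln Z  (= (E − F)/T)

print(f"S (direct)  = {S_direct:.6f}")
print(f"S (formula) = {S_formula:.6f}")
```

The direct Gibbs entropy and the partition-function formula agree identically, since the two expressions are algebraically the same for the canonical distribution.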
Related: first law of thermodynamics.
Boltzmann entropy
Boltzmann entropy counts the number of microstates $\Omega$ compatible with a given macrostate, $S = k_B \ln \Omega$. See harmonic oscillator in CSM#5) Entropy.
Maximum Entropy Distributions
- Constraint: only mean fixed.
  - On $[0, \infty)$ with mean $\mu$, the maximum entropy distribution is the exponential distribution (Boltzmann distribution):
    $$p(x) = \frac{1}{\mu}\,e^{-x/\mu}.$$
  - On $(-\infty, \infty)$, there is no maximum entropy distribution with only the mean fixed (entropy can be made arbitrarily large).
- Constraint: mean and variance fixed.
  - On $(-\infty, \infty)$, the maximum entropy distribution is the Gaussian distribution (a numerical comparison follows this list):
    $$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-(x-\mu)^2/2\sigma^2}.$$
  - On $[0, \infty)$, the maximizer has the truncated-Gaussian form $p(x) \propto e^{-\lambda_1 x - \lambda_2 x^2}$, when the constraints are feasible.
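A closed-form comparison supporting the Gaussian claim, assuming NumPy: among the Gaussian, Laplace, and uniform distributions matched to the same variance $\sigma^2 = 1$, the Gaussian has the largest differential entropy:

```python
# Differential entropies (nats) of three distributions with variance σ² = 1.
import numpy as np

sigma2 = 1.0
h_gauss   = 0.5 * np.log(2 * np.pi * np.e * sigma2)   # Gaussian: ½ ln(2πeσ²)
b = np.sqrt(sigma2 / 2)                               # Laplace scale: var = 2b²
h_laplace = 1 + np.log(2 * b)                         # Laplace: 1 + ln(2b)
w = np.sqrt(12 * sigma2)                              # Uniform width: var = w²/12
h_uniform = np.log(w)                                 # Uniform: ln(width)

print(f"Gaussian {h_gauss:.4f} > Laplace {h_laplace:.4f} > Uniform {h_uniform:.4f}")
```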
Derivation sketch (Lagrange multipliers)
Maximize the (differential) entropy
$$h[p] = -\int p(x)\,\ln p(x)\,dx$$
subject to normalization and the moment constraints. The variational problem
$$\delta\left[-\int p\ln p\,dx + \lambda_0\!\left(\int p\,dx - 1\right) + \sum_k \lambda_k\!\left(\int f_k(x)\,p\,dx - c_k\right)\right] = 0$$
yields (after variation)
$$p(x) \propto \exp\!\Big(\sum_k \lambda_k f_k(x)\Big).$$
- If only the mean is constrained: $p(x) \propto e^{\lambda_1 x}$, giving an exponential distribution (normalizable only on $[0, \infty)$, with $\lambda_1 < 0$).
- If mean and variance are constrained: $p(x) \propto e^{\lambda_1 x + \lambda_2 x^2}$, giving a Gaussian (with $\lambda_2 < 0$).
In general, maximum entropy solutions belong to exponential families, with form determined by the active constraints.
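A numerical sketch of the same variational problem, assuming SciPy: solve for the multipliers $(\lambda_1, \lambda_2)$ on a grid so that $p(x) \propto e^{\lambda_1 x + \lambda_2 x^2}$ matches mean $0$ and variance $1$, and check that they land on the Gaussian values $\lambda_1 = 0$, $\lambda_2 = -\tfrac{1}{2}$. The grid, targets, and initial guess are arbitrary choices:

```python
# Solve the maxent problem numerically for fixed mean and variance,
# and recover the Lagrange multipliers of the standard Gaussian.
import numpy as np
from scipy.optimize import fsolve

x = np.linspace(-8, 8, 4001)
dx = x[1] - x[0]

def moment_residuals(lams):
    lam1, lam2 = lams
    w = np.exp(lam1 * x + lam2 * x**2)   # unnormalized maxent density
    p = w / (np.sum(w) * dx)             # normalize on the grid
    mean = np.sum(x * p) * dx
    var = np.sum(x**2 * p) * dx - mean**2
    return mean - 0.0, var - 1.0         # target: mean 0, variance 1

lam1, lam2 = fsolve(moment_residuals, x0=[0.0, -1.0])
print(f"λ1 ≈ {lam1:.4f}, λ2 ≈ {lam2:.4f}   (N(0,1) ⇒ λ1 = 0, λ2 = −½)")
```

With the moment targets met, the recovered density is the standard normal, matching the Lagrange-multiplier result above.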