Entropic Uncertainty - Entropy Versus Variance Bounds

The Gaussian or normal probability distribution plays an important role in the relationship between variance and entropy: it is a standard problem of the calculus of variations to show that this distribution maximizes entropy for a given variance, and at the same time minimizes variance for a given entropy. In fact, for any probability density function φ on the real line, Shannon's entropy inequality specifies:

    H(φ) ≤ log √(2πe V(φ)),

where H is the Shannon (differential) entropy and V is the variance, an inequality that is saturated only in the case of a normal distribution.
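
As a quick sanity check, the sketch below (a rough numerical illustration, not anything from the original text) evaluates both sides of Shannon's inequality on a grid for a standard Gaussian density, which saturates the bound, and for a Laplace density of equal variance, which falls strictly below it. The grid and the choice of comparison density are arbitrary assumptions of this sketch.

import numpy as np

def entropy_and_variance(pdf, x):
    """Differential entropy and variance of a density sampled on grid x."""
    dx = x[1] - x[0]
    p = pdf / (pdf.sum() * dx)                   # renormalize on the grid
    mask = p > 0
    H = -np.sum(p[mask] * np.log(p[mask])) * dx
    mean = np.sum(x * p) * dx
    V = np.sum((x - mean) ** 2 * p) * dx
    return H, V

x = np.linspace(-30.0, 30.0, 120001)
gauss = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)     # N(0, 1), saturates the bound
b = 1 / np.sqrt(2)                                   # Laplace scale giving variance 2*b**2 = 1
laplace = np.exp(-np.abs(x) / b) / (2 * b)

for name, pdf in [("Gaussian", gauss), ("Laplace ", laplace)]:
    H, V = entropy_and_variance(pdf, x)
    bound = np.log(np.sqrt(2 * np.pi * np.e * V))
    print(f"{name}  H = {H:.4f}   log sqrt(2 pi e V) = {bound:.4f}")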

Moreover, the Fourier transform of a Gaussian probability amplitude function is also Gaussian, and so are the absolute squares of both. This can then be used to derive the usual Robertson variance uncertainty inequality from the above entropic inequality, which shows that the entropic inequality is the tighter of the two. That is (for ħ = 1), exponentiating the Hirschman inequality and using Shannon's expression above,

    ħ/2 ≤ exp(H_x + H_p)/(2πe) ≤ √(V_x V_p),

where H_x and H_p are the Shannon entropies of the position and momentum probability densities and V_x, V_p are their variances.
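
This chain can be illustrated numerically. The sketch below is a rough check under assumed conventions (ħ = 1 and the symmetric Fourier transform φ(p) = (2π)^(−1/2) ∫ ψ(x) e^(−ipx) dx); the test state, an equal superposition of two Gaussian packets, and the grid parameters are illustrative choices, not anything prescribed by the text.

import numpy as np

def entropy_and_variance(rho, grid, d):
    """Differential entropy and variance of a density sampled with spacing d."""
    m = rho > 0
    H = -np.sum(rho[m] * np.log(rho[m])) * d
    mean = np.sum(grid * rho) * d
    V = np.sum((grid - mean) ** 2 * rho) * d
    return H, V

N = 4096
x = np.linspace(-40.0, 40.0, N, endpoint=False)
dx = x[1] - x[0]

# Test state (an assumption of this sketch): two Gaussian packets at x = +/- 5
sigma, a = 1.0, 5.0
psi = np.exp(-(x - a) ** 2 / (4 * sigma ** 2)) + np.exp(-(x + a) ** 2 / (4 * sigma ** 2))
psi = psi / np.sqrt(np.sum(np.abs(psi) ** 2) * dx)        # normalize in L2

# Momentum-space amplitude via FFT, convention phi(p) = (2 pi)^(-1/2) int psi(x) e^{-ipx} dx
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dp = 2 * np.pi / (N * dx)
phi = dx / np.sqrt(2 * np.pi) * np.exp(-1j * p * x[0]) * np.fft.fft(psi)

rho_x = np.abs(psi) ** 2
rho_p = np.abs(phi) ** 2
rho_p = rho_p / (rho_p.sum() * dp)                        # guard against discretization error

Hx, Vx = entropy_and_variance(rho_x, x, dx)
Hp, Vp = entropy_and_variance(rho_p, p, dp)

print("hbar/2              :", 0.5)
print("exp(Hx+Hp)/(2 pi e) :", np.exp(Hx + Hp) / (2 * np.pi * np.e))
print("sqrt(Vx * Vp)       :", np.sqrt(Vx * Vp))

For this state the three printed numbers come out in increasing order, as the chain requires, with both inequalities strict because the state is not a minimum-uncertainty Gaussian.
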
Hirschman explained that entropy (his version of entropy was the negative of Shannon's) is a "measure of the concentration of [a probability distribution] in a set of small measure." Thus a low or large negative Shannon entropy means that a considerable mass of the probability distribution is confined to a set of small measure. Note that this set of small measure need not be contiguous; a probability distribution can have several concentrations of mass in intervals of small measure, and the entropy may still be low no matter how widely scattered those intervals are.

This is not the case with the variance: variance measures the concentration of mass about the mean of the distribution, and a low variance means that a considerable mass of the probability distribution is concentrated in a contiguous interval of small measure.
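
The contrast can be made concrete with a toy density (the numbers below are chosen purely for illustration): put all of the probability mass, uniformly, on two intervals of length 0.01 centred at 0 and at 100. The Shannon entropy is log 0.02 ≈ −3.9, just as low as for a single interval of measure 0.02, yet the variance is roughly 2500. The grid-based check below is a minimal sketch of that arithmetic.

import numpy as np

x = np.linspace(-1.0, 102.0, 2000001)
dx = x[1] - x[0]

eps = 0.01                                     # length of each of the two intervals
support = (np.abs(x - 0.0) <= eps / 2) | (np.abs(x - 100.0) <= eps / 2)
rho = np.where(support, 1.0 / (2 * eps), 0.0)
rho = rho / (rho.sum() * dx)                   # renormalize on the grid

mask = rho > 0
H = -np.sum(rho[mask] * np.log(rho[mask])) * dx
mean = np.sum(x * rho) * dx
V = np.sum((x - mean) ** 2 * rho) * dx

print("Shannon entropy H ~", H)                # about log(2*eps) = log 0.02 ~ -3.9
print("variance        V ~", V)                # about (100/2)**2 = 2500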

To formalize this distinction, we say that two probability density functions φ₁ and φ₂ are equimeasurable if:

    μ{x ∈ ℝ : φ₁(x) ≥ ε} = μ{x ∈ ℝ : φ₂(x) ≥ ε}   for every ε > 0,

where μ is the Lebesgue measure. Any two equimeasurable probability density functions have the same Shannon entropy, and in fact the same Rényi entropy of any order. The same is not true of variance, however. Any probability density function has a radially decreasing equimeasurable "rearrangement" whose variance is (up to translation) less than that of any other rearrangement of the function; and there exist rearrangements of arbitrarily high variance, all having the same entropy.
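
As a concrete illustration (the example densities are invented for this sketch), take φ₂ uniform on [0, 1/2] ∪ [10, 10.5] and φ₁ uniform on [0, 1], which is a translate of the decreasing rearrangement of φ₂. The two are equimeasurable, so their Shannon and Rényi entropies coincide, while their variances differ by more than two orders of magnitude. The grid resolution and interval placement below are arbitrary choices.

import numpy as np

x = np.linspace(-1.0, 12.0, 1300001)
dx = x[1] - x[0]

def normalize(rho):
    return rho / (rho.sum() * dx)

# phi1: uniform on [0, 1] (a translate of the decreasing rearrangement of phi2)
phi1 = normalize(np.where((x >= 0) & (x <= 1), 1.0, 0.0))
# phi2: the same mass split between [0, 1/2] and [10, 10.5]
phi2 = normalize(np.where(((x >= 0) & (x <= 0.5)) | ((x >= 10) & (x <= 10.5)), 1.0, 0.0))

def shannon(rho):
    m = rho > 0
    return -np.sum(rho[m] * np.log(rho[m])) * dx

def renyi2(rho):
    # Renyi entropy of order 2 (collision entropy): -log of the integral of rho^2
    return -np.log(np.sum(rho ** 2) * dx)

def variance(rho):
    mean = np.sum(x * rho) * dx
    return np.sum((x - mean) ** 2 * rho) * dx

for name, rho in [("phi1 (one interval) ", phi1), ("phi2 (two intervals)", phi2)]:
    print(f"{name}  H = {shannon(rho):.3f}  H2 = {renyi2(rho):.3f}  V = {variance(rho):.4f}")
# Both entropies agree (both ~0), while V(phi1) ~ 1/12 and V(phi2) ~ 25.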
