Maximization in the Normal Distribution
For a given variance, differential entropy is maximized by the normal distribution. The following is a proof that a Gaussian variable has the largest entropy among all random variables of equal variance.
Let g(x) be a Gaussian PDF with mean μ and variance σ², and let f(x) be an arbitrary PDF with the same variance. Since differential entropy is translation invariant, we can assume that f(x) has the same mean μ as g(x).
Consider the Kullback–Leibler divergence between the two distributions:

\[
0 \le D_{KL}(f \,\|\, g) = \int_{-\infty}^{\infty} f(x) \log\frac{f(x)}{g(x)}\,dx = -h(f) - \int_{-\infty}^{\infty} f(x) \log g(x)\,dx.
\]
Now note that

\[
\int_{-\infty}^{\infty} f(x) \log g(x)\,dx
= \int_{-\infty}^{\infty} f(x) \log\!\left(\frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-\frac{(x-\mu)^2}{2\sigma^2}}\right) dx
= -\frac{1}{2}\log(2\pi\sigma^2) - \frac{\sigma^2}{2\sigma^2}
= -\frac{1}{2}\log(2\pi e \sigma^2)
= -h(g),
\]

because the result does not depend on f(x) other than through the variance. Combining the two results yields

\[
h(g) - h(f) \ge 0,
\]
with equality when f(x) = g(x) almost everywhere, following from the non-negativity of the Kullback–Leibler divergence.
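As a numerical sanity check of this result (a sketch, not part of the original proof), one can compare the closed-form differential entropies of a Gaussian, a uniform, and a Laplace distribution, all normalized to the same variance σ². The closed-form expressions used below are standard; the choice of unit variance is an assumption for illustration.

```python
import math

sigma2 = 1.0  # common variance (assumed 1 for this sketch)

# Differential entropies in nats, from the standard closed-form expressions:
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)  # N(mu, sigma2)
h_uniform = 0.5 * math.log(12 * sigma2)                  # uniform of width sqrt(12*sigma2)
b = math.sqrt(sigma2 / 2)                                # Laplace scale giving variance sigma2
h_laplace = 1 + math.log(2 * b)                          # Laplace(mu, b)

# The Gaussian dominates both, as the proof guarantees.
assert h_gauss > h_laplace > h_uniform
```

With σ² = 1 this gives roughly 1.419 nats for the Gaussian versus 1.347 for the Laplace and 1.242 for the uniform, consistent with h(g) ≥ h(f).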
This result may also be demonstrated using the calculus of variations. A Lagrangian functional with two Lagrange multipliers may be defined as:

\[
L = \int_{-\infty}^{\infty} -g(x)\ln g(x)\,dx - \lambda_0\left(1 - \int_{-\infty}^{\infty} g(x)\,dx\right) - \lambda\left(\sigma^2 - \int_{-\infty}^{\infty} g(x)(x-\mu)^2\,dx\right),
\]
where g(x) is some function with mean μ. When the entropy of g(x) is at a maximum and the constraint equations, which consist of the normalization condition and the requirement of fixed variance, are both satisfied, then a small variation δg(x) about g(x) will produce a variation δL about L which is equal to zero:

\[
0 = \delta L = \int_{-\infty}^{\infty} \delta g(x)\left[-\ln g(x) - 1 + \lambda_0 + \lambda(x-\mu)^2\right] dx.
\]
Since this must hold for any small δg(x), the term in brackets must be zero, and solving for g(x) yields:

\[
g(x) = e^{\lambda_0 - 1 + \lambda(x-\mu)^2}.
\]
Using the constraint equations to solve for λ₀ and λ yields the normal distribution:

\[
g(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-\frac{(x-\mu)^2}{2\sigma^2}}.
\]
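The solved multipliers can be checked numerically (a sketch under the assumption μ = 0, σ² = 1): substituting λ = −1/(2σ²) and λ₀ = 1 − ½ ln(2πσ²) into g(x) = exp(λ₀ − 1 + λ(x−μ)²) should reproduce a PDF satisfying both constraint equations.

```python
import math

mu, sigma2 = 0.0, 1.0                          # assumed values for this sketch
lam = -1.0 / (2 * sigma2)                      # multiplier from the fixed-variance constraint
lam0 = 1 - 0.5 * math.log(2 * math.pi * sigma2)  # multiplier from normalization

def g(x):
    # Candidate maximizer from the variational derivation.
    return math.exp(lam0 - 1 + lam * (x - mu) ** 2)

# Check the two constraints by Riemann summation on a wide grid.
dx = 1e-3
xs = [-10 + i * dx for i in range(int(20 / dx))]
norm = sum(g(x) for x in xs) * dx              # should be ~1 (normalization)
var = sum(g(x) * (x - mu) ** 2 for x in xs) * dx  # should be ~sigma2 (fixed variance)

assert abs(norm - 1) < 1e-5 and abs(var - sigma2) < 1e-3
```

With these multipliers, g(x) reduces exactly to the standard normal density, confirming that the stationary point of the Lagrangian is the Gaussian.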