Another useful measure of entropy that works equally well in the discrete and the continuous case is the relative entropy of a distribution. It is defined as the Kullback-Leibler divergence from the distribution to a reference measure m as follows. Assume that a probability distribution p is absolutely continuous with respect to a measure m, i.e. is of the form p(dx) = f(x)m(dx) for some non-negative m-integrable function f with m-integral 1. Then the relative entropy can be defined as

D_KL(p‖m) = ∫ log(f(x)) p(dx) = ∫ f(x) log(f(x)) m(dx).
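For distributions on a finite set this definition can be written down directly. The following is a minimal sketch in Python (the function name relative_entropy and the example values are illustrative, not from the original text), with f(x) = p(x)/m(x) playing the role of the density of p with respect to m:

```python
import numpy as np

def relative_entropy(p, m):
    """D_KL(p || m) for distributions on a finite set.

    With the reference measure m carrying weight m[x] on each point,
    the density is f(x) = p[x] / m[x], and the integral of f*log(f) dm
    becomes sum_x p[x] * log(p[x] / m[x]); terms with p[x] == 0
    contribute 0 by the usual convention.
    """
    p = np.asarray(p, dtype=float)
    m = np.asarray(m, dtype=float)
    support = p > 0
    return float(np.sum(p[support] * np.log(p[support] / m[support])))

# Example with an (illustrative) reference probability measure m.
print(relative_entropy([0.5, 0.25, 0.25], [0.4, 0.4, 0.2]))  # ≈ 0.0499
```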
In this form the relative entropy generalises (up to a change in sign) both the discrete entropy, where the measure m is the counting measure, and the differential entropy, where the measure m is the Lebesgue measure. If the measure m is itself a probability distribution, the relative entropy is non-negative, and zero if and only if p = m as measures. It is defined for any measure space, hence coordinate-independent and invariant under coordinate reparameterizations if one properly takes into account the transformation of the measure m. The relative entropy, and implicitly entropy and differential entropy, do depend on the "reference" measure m.
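A small numerical check of these two claims (a sketch in Python using NumPy and SciPy's rel_entr; the specific arrays p and m are made up for illustration):

```python
import numpy as np
from scipy.special import rel_entr  # elementwise p*log(p/m), with 0*log(0) = 0

p = np.array([0.5, 0.25, 0.25])

# m = counting measure (weight 1 on each outcome): the relative entropy
# reduces to sum p log p, i.e. minus the discrete Shannon entropy.
counting = np.ones_like(p)
print(rel_entr(p, counting).sum())  # ≈ -1.0397, which is -H(p) in nats
print((p * np.log(p)).sum())        # same value

# m itself a probability distribution: D_KL(p || m) >= 0, zero iff p = m.
m = np.array([0.4, 0.4, 0.2])
print(rel_entr(p, m).sum())         # ≈ 0.0499 > 0
print(rel_entr(p, p).sum())         # 0.0
```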