Entropy (information theory) - Relative Entropy

Another useful measure of entropy that works equally well in the discrete and the continuous case is the relative entropy of a distribution. It is defined as the Kullback-Leibler divergence from the distribution to a reference measure m, as follows. Assume that a probability distribution p is absolutely continuous with respect to a measure m, i.e. is of the form p(dx) = f(x) m(dx) for some non-negative m-integrable function f with m-integral 1. Then the relative entropy can be defined as

D_{KL}(p \| m) = \int \log(f(x)) \, p(dx) = \int f(x) \log(f(x)) \, m(dx).
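
As a concrete worked example (the specific numbers are chosen here for illustration and are not from the original text): let p assign weights 0.25 and 0.75 to the two points of {0, 1}, and let m be the counting measure on that set. Then f(0) = 0.25 and f(1) = 0.75, so

D_{KL}(p \| m) = 0.25 \log 0.25 + 0.75 \log 0.75 \approx -0.562 \text{ nats},

which is the negative of the Shannon entropy of p, in line with the discussion below.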

In this form the relative entropy generalises (up to change in sign) both the discrete entropy, where the measure m is the counting measure, and the differential entropy, where the measure m is the Lebesgue measure. If the measure m is itself a probability distribution, the relative entropy is non-negative, and zero if and only if p = m as measures. It is defined for any measure space, hence is coordinate-independent and invariant under coordinate reparameterizations if one properly takes into account the transformation of the measure m. The relative entropy, and implicitly entropy and differential entropy, do depend on the "reference" measure m.
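
To make these sign conventions concrete, here is a minimal numerical sketch in Python, using a toy two-point distribution chosen purely for illustration (the function name and the example weights are assumptions of this sketch, not part of the original text). It checks that relative entropy taken against the counting measure is minus the Shannon entropy, while against a probability measure it is non-negative and vanishes only when the two distributions coincide.

    import math

    def relative_entropy(p, m):
        # D_KL(p || m) for a discrete distribution p and a reference
        # measure m, both given as dicts mapping outcomes to weights.
        # The weights of m need not sum to 1 (m may be, for example,
        # the counting measure).  Assumes p is absolutely continuous
        # with respect to m, i.e. m[x] > 0 wherever p[x] > 0.
        return sum(px * math.log(px / m[x]) for x, px in p.items() if px > 0)

    # A distribution p on the two-point space {0, 1}.
    p = {0: 0.25, 1: 0.75}

    # Counting measure (weight 1 on each point) and uniform probability measure.
    counting = {0: 1.0, 1: 1.0}
    uniform = {0: 0.5, 1: 0.5}

    # Against the counting measure the relative entropy is minus the
    # Shannon entropy of p (the "up to change in sign" remark above).
    shannon = -sum(px * math.log(px) for px in p.values())
    print(relative_entropy(p, counting))   # approx. -0.5623
    print(-shannon)                        # approx. -0.5623

    # Against a probability measure the relative entropy is non-negative,
    # and zero exactly when p equals the reference measure.
    print(relative_entropy(p, uniform))        # approx. 0.1308
    print(relative_entropy(uniform, uniform))  # 0.0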
