Cross Entropy

In information theory, the cross entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set, if the coding scheme used is optimized for the probability distribution q, rather than the "true" distribution p.

The cross entropy for two distributions p and q over the same probability space is thus defined as follows:

H(p, q) = \operatorname{E}_p[-\log q] = H(p) + D_{\mathrm{KL}}(p \,\|\, q),

where H(p) is the entropy of p, and D_{\mathrm{KL}}(p \,\|\, q) is the Kullback-Leibler divergence of q from p (also known as the relative entropy).

For discrete p and q this means

H(p, q) = -\sum_{x} p(x) \log q(x).

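As a quick numerical check of the discrete formula and of the decomposition given above, the following sketch computes the cross entropy directly and compares it with H(p) + D_KL(p ‖ q). The distributions p and q below are made-up illustrative values, not taken from the text, and logarithms are taken base 2 so the result is in bits:

    import math

    # Two made-up discrete distributions over the same three outcomes.
    p = [0.5, 0.3, 0.2]   # "true" distribution
    q = [0.2, 0.5, 0.3]   # distribution the coding scheme is optimized for

    # Cross entropy: H(p, q) = -sum_x p(x) * log2 q(x)
    cross_entropy = -sum(px * math.log2(qx) for px, qx in zip(p, q))

    # Entropy of p: H(p) = -sum_x p(x) * log2 p(x)
    entropy_p = -sum(px * math.log2(px) for px in p)

    # Kullback-Leibler divergence: D_KL(p || q) = sum_x p(x) * log2(p(x) / q(x))
    kl_pq = sum(px * math.log2(px / qx) for px, qx in zip(p, q))

    print(f"H(p, q)           = {cross_entropy:.6f} bits")
    print(f"H(p) + D_KL(p||q) = {entropy_p + kl_pq:.6f} bits")
    # The two printed values agree, illustrating H(p, q) = H(p) + D_KL(p || q).
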
The situation for continuous distributions is analogous:

H(p, q) = -\int_{X} p(x) \log q(x) \, dx.

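As an illustration of the continuous case, the sketch below (again a made-up example, not from the text, and requiring scipy for the numerical integration) evaluates the integral -\int p(x) \log q(x) dx for two normal densities using the natural logarithm, so the result is in nats. For comparison it also evaluates the standard closed-form expression for the cross entropy between two Gaussians, which is not stated in this article:

    import math
    from scipy.integrate import quad
    from scipy.stats import norm

    # Two made-up normal densities: p is the "true" distribution, q the assumed one.
    mu_p, sigma_p = 0.0, 1.0
    mu_q, sigma_q = 1.0, 2.0
    p = norm(mu_p, sigma_p)
    q = norm(mu_q, sigma_q)

    # Numerical integration of H(p, q) = -integral of p(x) * log q(x) dx  (nats)
    integral, _ = quad(lambda x: -p.pdf(x) * q.logpdf(x), -math.inf, math.inf)

    # Closed form for two Gaussians (an assumed, standard identity):
    # H(p, q) = 0.5*log(2*pi*sigma_q^2) + (sigma_p^2 + (mu_p - mu_q)^2) / (2*sigma_q^2)
    closed_form = 0.5 * math.log(2 * math.pi * sigma_q**2) \
        + (sigma_p**2 + (mu_p - mu_q)**2) / (2 * sigma_q**2)

    print(f"numerical   : {integral:.6f} nats")
    print(f"closed form : {closed_form:.6f} nats")
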
NB: The notation H(p, q) is sometimes used for both the cross entropy and the joint entropy of p and q.

