Quantities of Information - Joint Entropy

The joint entropy of two discrete random variables X and Y is defined as the entropy of the joint distribution of X and Y:

    H(X, Y) = -∑_{x, y} p(x, y) log p(x, y)

If X and Y are independent, then the joint entropy is simply the sum of their individual entropies, H(X, Y) = H(X) + H(Y).
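
As a concrete illustration, here is a minimal sketch in Python (assuming NumPy is available; the helper names entropy and joint_entropy are this example's own) that computes the joint entropy in bits from a joint probability table and checks the additivity property for independent variables.

    import numpy as np

    def entropy(p):
        """Shannon entropy in bits of a probability array (zero entries contribute 0)."""
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]
        return -np.sum(nz * np.log2(nz))

    def joint_entropy(pxy):
        """Joint entropy H(X, Y) of a 2-D joint probability table p(x, y)."""
        return entropy(pxy)

    # Example: X and Y independent, each uniform on {0, 1}.
    px = np.array([0.5, 0.5])
    py = np.array([0.5, 0.5])
    pxy = np.outer(px, py)               # independence: p(x, y) = p(x) * p(y)

    print(joint_entropy(pxy))            # 2.0 bits
    print(entropy(px) + entropy(py))     # also 2.0 bits: H(X, Y) = H(X) + H(Y)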

(Note: The joint entropy should not be confused with the cross entropy, despite the similar notation.)
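
For contrast, the usual definition of cross entropy compares two distributions p and q over the same alphabet,

    H(p, q) = -∑_x p(x) log q(x)

whereas the joint entropy above is the ordinary entropy of a single (joint) distribution.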
