Quantities of Information - Joint Entropy

The joint entropy of two discrete random variables X and Y is defined as the entropy of the joint distribution of X and Y:

    H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y)

If X and Y are independent, then the joint entropy is simply the sum of their individual entropies: H(X, Y) = H(X) + H(Y).
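
As an illustration (an addition, not part of the original article), here is a minimal Python sketch that computes the joint entropy of a discrete joint distribution given as a 2-D table of probabilities and checks the additivity property for an independent pair; the function names are hypothetical:

    import numpy as np

    def joint_entropy(p_xy):
        """Joint entropy H(X, Y) in bits for a 2-D array of joint probabilities."""
        p = np.asarray(p_xy, dtype=float).ravel()
        p = p[p > 0]  # treat 0 * log 0 as 0, per the usual convention
        return -np.sum(p * np.log2(p))

    def entropy(p_x):
        """Marginal entropy H(X) in bits for a 1-D probability vector."""
        p = np.asarray(p_x, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # For an independent pair, the joint table is the outer product of the
    # marginals, so H(X, Y) should equal H(X) + H(Y).
    p_x = np.array([0.5, 0.5])
    p_y = np.array([0.9, 0.1])
    p_xy = np.outer(p_x, p_y)

    print(joint_entropy(p_xy))          # ~1.4690 bits
    print(entropy(p_x) + entropy(p_y))  # same value, confirming additivity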

(Note: The joint entropy should not be confused with the cross entropy, despite the similar notation.)
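
For contrast (an added note using the standard definition, not taken from the original article), cross entropy is a function of two distributions p and q over the same alphabet:

    H(p, q) = -\sum_{x} p(x) \log q(x)

Both quantities are commonly written H(·, ·), which is the source of the confusion: joint entropy takes two random variables defined on one joint distribution, whereas cross entropy takes two distributions.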
