Quantities of Information - Joint Entropy

The joint entropy H(X, Y) of two discrete random variables X and Y is defined as the entropy of the joint distribution of X and Y:

    H(X, Y) = −Σₓ Σᵧ p(x, y) log₂ p(x, y)

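To make the definition concrete, here is a minimal Python sketch (illustrative, not from the original article); the function name joint_entropy and the example distribution are assumptions chosen for demonstration.

    # Minimal sketch: joint entropy of a discrete joint pmf given as a
    # dict mapping (x, y) pairs to probabilities.
    import math

    def joint_entropy(pxy):
        # H(X, Y) = -sum over (x, y) of p(x, y) * log2 p(x, y);
        # zero-probability pairs contribute nothing and are skipped.
        return -sum(p * math.log2(p) for p in pxy.values() if p > 0)

    # Example: X is a fair bit and Y always equals X, so only two pairs occur.
    pxy = {(0, 0): 0.5, (1, 1): 0.5}
    print(joint_entropy(pxy))  # 1.0 bit: the pair carries no more information than X alone
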
If X and Y are independent, then the joint entropy is simply the sum of their individual entropies: H(X, Y) = H(X) + H(Y). A small sketch of this additivity follows.
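
The following sketch (again illustrative, with an arbitrarily chosen pair of independent distributions) shows that when the joint pmf factors into a product p(x)p(y), the joint entropy equals H(X) + H(Y):

    # Sketch: for independent X and Y the joint pmf is the product p(x)p(y),
    # and H(X, Y) comes out equal to H(X) + H(Y).
    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    px = [0.5, 0.5]    # fair bit X (assumed example)
    py = [0.25, 0.75]  # independent biased bit Y (assumed example)
    pxy = [p * q for p in px for q in py]  # product distribution of (X, Y)

    print(entropy(pxy))               # ~1.8113 bits
    print(entropy(px) + entropy(py))  # ~1.8113 bits, matching the joint entropy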

(Note: The joint entropy should not be confused with the cross entropy, despite the similar notation.)
