Mutual Information

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the mutual dependence between the two variables. The most common unit of measurement of mutual information is the bit, used when the logarithm is taken to base 2.
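
For discrete random variables X and Y with joint distribution p(x, y) and marginal distributions p(x) and p(y), the quantity described above is given by the standard definition (written here in LaTeX notation):

    I(X; Y) = \sum_{x} \sum_{y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}

With the logarithm taken to base 2 the result is measured in bits; with the natural logarithm it is measured in nats. The sum is zero exactly when X and Y are independent, since then p(x, y) = p(x) p(y) and every logarithm vanishes.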

Read more about Mutual Information: Definition of Mutual Information, Relation To Other Quantities, Variations of Mutual Information, Applications of Mutual Information
