Mutual Information

In probability theory and information theory, the mutual information of two random variables (formerly known by the term transinformation) measures their mutual dependence. It is most commonly measured in bits, which corresponds to taking logarithms to base 2.
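
For discrete random variables X and Y with joint distribution p(x, y) and marginals p(x) and p(y), the standard definition is

    I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

with the logarithm taken to base 2 when the result is measured in bits. As a minimal sketch of how this can be computed (the function name and the dict-based representation of the joint distribution are illustrative, not from the source), in Python:

    import math

    def mutual_information(joint):
        # Accumulate the marginal distributions p(x) and p(y)
        # from a dict mapping (x, y) pairs to probabilities.
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        # I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) p(y))),
        # skipping zero-probability outcomes.
        return sum(p * math.log2(p / (px[x] * py[y]))
                   for (x, y), p in joint.items() if p > 0)

    # Example: two fair coins that always agree share exactly 1 bit.
    print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0

The example illustrates the interpretation: knowing one coin completely determines the other, so the mutual dependence equals the full 1 bit of entropy in either variable.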

