Mutual Information

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures their mutual dependence. The most common unit of mutual information is the bit, which corresponds to taking logarithms to base 2.
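For discrete random variables X and Y with joint distribution p(x, y) and marginal distributions p(x) and p(y), the standard definition (with base-2 logarithms, so the result is measured in bits) is

    I(X;Y) = \sum_{y \in \mathcal{Y}} \sum_{x \in \mathcal{X}} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}

If X and Y are independent, the sum is zero, since p(x, y) = p(x) p(y) everywhere; for two perfectly correlated fair coin flips it equals exactly 1 bit. As a minimal illustrative sketch (not part of the original article; the function name and the dict-based representation of the joint distribution are chosen here for illustration), the definition can be evaluated directly from a joint probability table:

    from math import log2

    def mutual_information(joint):
        """Mutual information in bits.

        joint: dict mapping (x, y) pairs to probabilities summing to 1.
        """
        # Accumulate the marginal distributions p(x) and p(y).
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        # I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) p(y))),
        # skipping zero-probability cells (their contribution is zero).
        return sum(p * log2(p / (px[x] * py[y]))
                   for (x, y), p in joint.items() if p > 0)

    # Two perfectly correlated fair coins share exactly 1 bit of information.
    print(mutual_information({("H", "H"): 0.5, ("T", "T"): 0.5}))  # 1.0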