Mutual Information

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures their mutual dependence: how much knowing one variable reduces uncertainty about the other. The most common unit of measurement of mutual information is the bit, used when logarithms are taken to base 2.
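For discrete variables, mutual information has the standard closed form I(X;Y) = Σ p(x,y) log₂[ p(x,y) / (p(x)p(y)) ], summed over pairs with nonzero joint probability. A minimal sketch of that computation (the function name and the dict-based representation of the joint distribution are illustrative choices, not part of the original article):

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, given a joint pmf as a
    dict mapping (x, y) pairs to probabilities summing to 1."""
    # Accumulate the marginal distributions p(x) and p(y).
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) ),
    # skipping zero-probability pairs, whose contribution is zero.
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)
```

Two sanity checks: a pair of perfectly correlated fair coins, `{(0, 0): 0.5, (1, 1): 0.5}`, yields 1 bit (knowing one coin fully determines the other), while two independent fair coins yield 0 bits.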

Read more about Mutual Information:  Definition of Mutual Information, Relation To Other Quantities, Variations of Mutual Information, Applications of Mutual Information