In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures their mutual dependence. The most common unit of measurement of mutual information is the bit, used when the logarithms are taken to base 2.
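The idea can be sketched concretely. The snippet below is a minimal illustration (the function name and dict-based representation of the joint distribution are our own choices, not from the source): it computes I(X;Y) in bits from a discrete joint distribution, using the standard formula I(X;Y) = Σ p(x,y) log2[p(x,y) / (p(x)p(y))].

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits.

    `joint` maps (x, y) pairs to their joint probability p(x, y).
    Illustrative sketch using the standard discrete formula.
    """
    # Accumulate the marginal distributions p(x) and p(y).
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) p(y))).
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Y is an exact copy of a fair coin X: knowing Y tells us X, so I = 1 bit.
copy = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(copy))   # 1.0

# Two independent fair coins share no information: I = 0 bits.
indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_information(indep))  # 0.0
```

The two extremes illustrate the measure: a perfect copy of a fair coin yields exactly one bit, while independent variables yield zero.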