In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures their mutual dependence. The most common unit of measurement of mutual information is the bit, used when logarithms to base 2 are taken.
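For discrete variables, mutual information is defined as I(X;Y) = Σ p(x,y) log₂( p(x,y) / (p(x)p(y)) ), summed over all outcome pairs. A minimal sketch in Python (the function name and joint-table representation are illustrative, not from any particular library):

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, from a joint distribution.

    `joint[i][j]` holds P(X=i, Y=j); all entries must sum to 1.
    """
    # Marginals: row sums give p(x), column sums give p(y).
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:  # 0 * log(0) is taken as 0 by convention
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Two perfectly correlated fair bits share 1 bit of information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # → 1.0
# Two independent fair bits share none.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
```

The two test cases bracket the range of dependence: identical variables give I(X;Y) equal to their entropy (1 bit for a fair coin), while independent variables give exactly 0.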