In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the mutual dependence between the two variables. When logarithms to base 2 are used, the most common unit of measurement is the bit.
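For discrete variables, mutual information can be computed directly from the joint distribution as I(X;Y) = Σ p(x,y) log[ p(x,y) / (p(x)p(y)) ]. The following is a minimal sketch in Python; the function name and the dict-based representation of the joint distribution are illustrative choices, not part of any standard API.

```python
import math

def mutual_information(joint, base=2):
    """Mutual information I(X;Y) of a discrete joint distribution.

    `joint` maps (x, y) pairs to probabilities.
    With base=2 the result is measured in bits.
    """
    # Marginal distributions p(x) and p(y)
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # I(X;Y) = sum over (x,y) of p(x,y) * log(p(x,y) / (p(x) p(y)))
    return sum(
        p * math.log(p / (px[x] * py[y]), base)
        for (x, y), p in joint.items()
        if p > 0  # terms with zero probability contribute nothing
    )

# Two perfectly correlated fair coin flips: knowing X determines Y,
# so I(X;Y) equals the full entropy of one flip, 1 bit.
perfect = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(perfect))  # 1.0

# Two independent fair coin flips: knowing X says nothing about Y.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(mutual_information(independent))  # 0.0
```

The two extremes illustrate the definition: mutual information is zero exactly when the variables are independent, and equals the entropy of one variable when the two determine each other completely.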