In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a measure of the mutual dependence between them. When logarithms to base 2 are used, the most common unit of measurement of mutual information is the bit.
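For two discrete random variables X and Y with joint distribution p(x, y) and marginals p(x) and p(y), the mutual information is I(X; Y) = sum over x, y of p(x, y) log[ p(x, y) / (p(x) p(y)) ], with the logarithm taken to base 2 for an answer in bits. The following Python sketch computes this quantity directly from a joint distribution; the function name and the dictionary representation of the joint are illustrative assumptions, not taken from the source.

    import math

    def mutual_information(joint, base=2):
        """Mutual information I(X; Y) of a discrete joint distribution.

        joint: dict mapping (x, y) pairs to probabilities p(x, y).
        Returns I(X; Y) in bits when base=2.
        """
        # Marginals p(x) and p(y), obtained by summing out the other variable.
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        # I(X; Y) = sum of p(x, y) * log(p(x, y) / (p(x) * p(y))),
        # skipping zero-probability outcomes, which contribute nothing.
        return sum(
            p * math.log(p / (px[x] * py[y]), base)
            for (x, y), p in joint.items()
            if p > 0
        )

    # Two perfectly correlated fair coin flips share exactly 1 bit:
    print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0

If X and Y are independent, p(x, y) = p(x) p(y) everywhere, so every log term is 0 and the mutual information vanishes, matching the intuition that independent variables share no information.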