In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the mutual dependence of the two variables: how much knowing one of them reduces uncertainty about the other. When logarithms to base 2 are used, the most common unit of measurement of mutual information is the bit.
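To make the definition concrete, here is a minimal sketch in Python, assuming NumPy is available; the function name `mutual_information_bits` is illustrative, not a standard API. It evaluates the usual discrete formula I(X; Y) = Σ p(x, y) log₂[ p(x, y) / (p(x) p(y)) ] for a joint distribution given as a matrix of probabilities.

```python
import numpy as np

def mutual_information_bits(joint: np.ndarray) -> float:
    """Mutual information I(X; Y) in bits for a discrete joint distribution.

    joint[i, j] is P(X = i, Y = j); entries must be non-negative and sum to 1.
    """
    px = joint.sum(axis=1, keepdims=True)   # marginal P(X = i)
    py = joint.sum(axis=0, keepdims=True)   # marginal P(Y = j)
    # Sum p(x, y) * log2(p(x, y) / (p(x) p(y))) over cells with p(x, y) > 0;
    # zero-probability cells contribute nothing to the sum.
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px * py)[nz])))

# Two identical fair bits: knowing X determines Y, so I(X; Y) = H(X) = 1 bit.
perfectly_dependent = np.array([[0.5, 0.0],
                                [0.0, 0.5]])
print(mutual_information_bits(perfectly_dependent))  # 1.0

# Two independent fair bits share no information: I(X; Y) = 0 bits.
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])
print(mutual_information_bits(independent))  # 0.0
```

The two examples bracket the range of behavior: mutual information is zero exactly when the variables are independent, and for perfectly dependent variables it equals the entropy of either one.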