In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the variables' mutual dependence: how much knowing one of them reduces uncertainty about the other. When logarithms to base 2 are used, the most common unit of measurement is the bit.
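The standard definition for discrete variables is I(X; Y) = Σ p(x, y) · log2( p(x, y) / (p(x) p(y)) ), summed over all outcome pairs. A minimal sketch of that formula in Python (the `mutual_information` helper and the dict-based joint-distribution representation are illustrative choices, not from the original text):

```python
import math

def mutual_information(joint):
    """Mutual information I(X; Y) in bits.

    `joint` maps (x, y) outcome pairs to probabilities that sum to 1.
    """
    # Marginal distributions p(x) and p(y), obtained by summing out
    # the other variable.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p

    # I(X; Y) = sum over (x, y) of p(x, y) * log2(p(x, y) / (p(x) p(y))).
    # Terms with p(x, y) = 0 contribute nothing and are skipped.
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Two perfectly correlated fair coins: observing X reveals Y exactly,
# so the mutual information is 1 bit.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0

# Two independent fair coins share no information: 0 bits.
print(mutual_information({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # 0.0
```

The dependence is symmetric, so I(X; Y) = I(Y; X), and the value is zero exactly when the two variables are independent.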