Quantities of Information - Mutual Information (transinformation)

One of the most useful and important measures of information is the mutual information, or transinformation. This is a measure of how much information can be obtained about one random variable by observing another. The mutual information of X relative to Y (which represents conceptually the average amount of information about X that can be gained by observing Y) is given by:

I(X;Y) = Σ_y p(y) Σ_x p(x|y) log( p(x|y) / p(x) ) = Σ_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) )
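
As a minimal numerical sketch, the defining double sum can be evaluated directly over a small, made-up joint distribution (all values here are illustrative, not from the text):

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y), obtained by summing out the other variable.
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

# I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) ), in bits.
mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)
print(round(mi, 4))  # about 0.2781 bits for this distribution
```

Using log base 2 gives the result in bits; natural logarithms would give nats.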

A basic property of the mutual information is that:

I(X;Y) = H(X) - H(X|Y)

That is, knowing Y, we can save an average of I(X;Y) bits in encoding X compared to not knowing Y. Mutual information is symmetric:

I(X;Y) = I(Y;X) = H(X) + H(Y) - H(X,Y)

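These entropy identities can be checked numerically; the joint distribution below is a made-up example, not from the text:

```python
import math

# Hypothetical joint distribution p(x, y); all quantities in bits.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {0: 0.5, 1: 0.5}  # marginal of X
py = {0: 0.5, 1: 0.5}  # marginal of Y

def H(dist):
    """Shannon entropy of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Mutual information from the definition.
mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)

# Symmetric form: I(X;Y) = H(X) + H(Y) - H(X,Y).
assert abs(mi - (H(px) + H(py) - H(joint))) < 1e-12
```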
Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) of the posterior probability distribution of X given the value of Y from the prior distribution on X:

I(X;Y) = E_{p(y)} [ D_KL( p(X|Y=y) || p(X) ) ]

In other words, this is a measure of how much, on average, the probability distribution on X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:

I(X;Y) = D_KL( p(X,Y) || p(X) p(Y) )

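Both divergence forms can be verified to agree with each other on a small example; the distribution below is illustrative only:

```python
import math

def kl(p, q):
    """D_KL(p || q) in bits, for distributions over the same outcomes."""
    return sum(pi * math.log2(pi / q[k]) for k, pi in p.items() if pi > 0)

# Hypothetical joint distribution and its marginals.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {0: 0.5, 1: 0.5}
py = {0: 0.5, 1: 0.5}

# Product of the marginals p(x)p(y) over the same outcome pairs.
prod = {(x, y): px[x] * py[y] for x in px for y in py}

# Form 1: I(X;Y) = D_KL( p(X,Y) || p(X) p(Y) ).
mi = kl(joint, prod)

# Form 2: the p(y)-weighted average of D_KL( p(X|Y=y) || p(X) ).
avg = sum(py[y] * kl({x: joint[(x, y)] / py[y] for x in px}, px)
          for y in py)
assert abs(mi - avg) < 1e-12
```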
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution, and to Pearson's χ² test: mutual information can be considered a statistic for assessing independence between a pair of variables, and it has a well-specified asymptotic distribution.
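
As a sketch of this connection: for a contingency table of N observations, the log-likelihood-ratio (G-test) statistic equals 2N times the empirical mutual information measured in nats. The table below is a made-up example:

```python
import math

# Hypothetical 2x2 contingency table of observed counts.
table = [[40, 10],
         [10, 40]]
N = sum(sum(row) for row in table)

row_tot = [sum(row) for row in table]
col_tot = [sum(col) for col in zip(*table)]

# Empirical mutual information (in nats) of the two classifying variables,
# using cell frequencies n/N and the marginal totals.
mi_nats = sum((n / N) * math.log((n * N) / (row_tot[i] * col_tot[j]))
              for i, row in enumerate(table)
              for j, n in enumerate(row) if n > 0)

# G = 2 * N * I(X;Y); under independence it is asymptotically
# chi-squared distributed, like Pearson's statistic.
G = 2 * N * mi_nats
```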
