Quantum Mutual Information - Motivation

Motivation

For simplicity, it will be assumed that all objects in the article are finite dimensional.

The definition of quantum mutual entropy is motivated by the classical case. For a probability distribution of two variables p(x, y), the two marginal distributions are

p(x) = \sum_{y} p(x,y), \qquad p(y) = \sum_{x} p(x,y) .
The classical mutual information I(X, Y) is defined by

I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y)) ,
where S(q) denotes the Shannon entropy of the probability distribution q.
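
As a quick numerical illustration of this definition, here is a minimal Python sketch; the 2x2 joint distribution p_xy and the helper shannon_entropy are hypothetical choices for this example (natural logarithms are used), not part of the article.

import numpy as np

def shannon_entropy(q):
    # Shannon entropy S(q) = -sum_i q_i log q_i, skipping zero entries.
    q = np.asarray(q, dtype=float)
    nz = q[q > 0]
    return -np.sum(nz * np.log(nz))

# Hypothetical correlated joint distribution p(x, y) on a 2 x 2 alphabet.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal p(x) = sum_y p(x, y)
p_y = p_xy.sum(axis=0)   # marginal p(y) = sum_x p(x, y)

# I(X, Y) = S(p(x)) + S(p(y)) - S(p(x, y))
I_xy = shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy)
print(I_xy)   # positive here, since x and y are correlated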

One can calculate directly

S(p(x)) + S(p(y)) = -\left( \sum_x p(x) \log p(x) + \sum_y p(y) \log p(y) \right)

\; = -\left( \sum_x \left( \sum_{y'} p(x,y') \log \sum_{y'} p(x,y') \right) + \sum_y \left( \sum_{x'} p(x',y) \log \sum_{x'} p(x',y) \right) \right)

\; = -\sum_{x,y} p(x,y) \left( \log \sum_{y'} p(x,y') + \log \sum_{x'} p(x',y) \right)

\; = -\sum_{x,y} p(x,y) \log p(x) p(y) .

On the other hand,

S(p(x,y)) = -\sum_{x,y} p(x,y) \log p(x,y) .

So the mutual information is

I(X,Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\, p(y)} .
But this is precisely the relative entropy between p(x, y) and p(x)p(y). In other words, if we assume the two variables x and y to be uncorrelated, mutual information is the discrepancy in uncertainty resulting from this (possibly erroneous) assumption.

It follows from the non-negativity of the relative entropy that I(X, Y) ≥ 0, with equality if and only if p(x, y) = p(x)p(y).
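
Both facts can be checked numerically. The sketch below, using the same hypothetical 2x2 example as above, computes the relative entropy D(p(x,y) || p(x)p(y)) directly; it is positive for the correlated distribution and zero for a product distribution. The function name relative_entropy and the example numbers are assumptions of this sketch.

import numpy as np

def relative_entropy(p, q):
    # Relative entropy D(p || q) = sum_i p_i log(p_i / q_i), skipping zero entries of p.
    p = np.asarray(p, dtype=float).ravel()
    q = np.asarray(q, dtype=float).ravel()
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)            # p(x)
p_y = p_xy.sum(axis=0)            # p(y)
p_prod = np.outer(p_x, p_y)       # product distribution p(x)p(y)

print(relative_entropy(p_xy, p_prod))    # I(X, Y) > 0: p(x, y) is not a product

q_xy = p_prod                     # a joint distribution that is exactly a product
q_prod = np.outer(q_xy.sum(axis=1), q_xy.sum(axis=0))
print(relative_entropy(q_xy, q_prod))    # 0.0: equality case p(x, y) = p(x)p(y)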
