Motivation
For simplicity, it will be assumed that all objects in the article are finite-dimensional.
We first discuss the classical case. Suppose the probabilities of a finite sequence of events are given by the probability distribution $P = \{p_1, \ldots, p_n\}$, but somehow we mistakenly assumed them to be $Q = \{q_1, \ldots, q_n\}$. For instance, we can mistake an unfair coin for a fair one. According to this erroneous assumption, our uncertainty about the $j$-th event, or equivalently, the amount of information provided after observing the $j$-th event, is

$$-\log q_j .$$
The (assumed) average uncertainty of all possible events is then

$$-\sum_j p_j \log q_j .$$
On the other hand, the Shannon entropy of the probability distribution $P$, defined by

$$-\sum_j p_j \log p_j ,$$
is the real amount of uncertainty before observation. Therefore the difference between these two quantities

$$-\sum_j p_j \log q_j - \left(-\sum_j p_j \log p_j\right)$$
is a measure of the distinguishability of the two probability distributions $P$ and $Q$. This is precisely the classical relative entropy, or Kullback–Leibler divergence:

$$D_{\mathrm{KL}}(P \| Q) = \sum_j p_j \log \frac{p_j}{q_j} .$$
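As a concrete illustration (the specific numbers are added here for exposition and are not part of the original derivation), suppose a biased coin with distribution $P = \{0.7, 0.3\}$ is mistaken for a fair coin with $Q = \{0.5, 0.5\}$. Taking logarithms base 2,

$$D_{\mathrm{KL}}(P \| Q) = 0.7 \log_2 \frac{0.7}{0.5} + 0.3 \log_2 \frac{0.3}{0.5} \approx 0.3398 - 0.2211 \approx 0.119 \text{ bits},$$

so each observed toss yields about 0.119 bits of information distinguishing the true distribution from the assumed one.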
Note
- In the definitions above, the convention that $0 \log 0 = 0$ is assumed, since $\lim_{x \to 0} x \log x = 0$. Intuitively, one would expect an event of zero probability to contribute nothing towards entropy.
- The relative entropy is not a metric. For example, it is not symmetric: the uncertainty discrepancy incurred by mistaking a fair coin for an unfair one is not the same as that incurred in the opposite situation (see the numerical sketch below).
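A minimal Python sketch of this quantity (the coin probabilities and the helper name `kl_divergence` are illustrative, not from the article) that implements the $0 \log 0 = 0$ convention and exhibits the asymmetry noted above:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in bits.

    Terms with p_j = 0 are skipped, implementing the
    convention 0 * log 0 = 0 from the note above.
    """
    total = 0.0
    for pj, qj in zip(p, q):
        if pj == 0:
            continue  # 0 * log 0 = 0 by convention
        total += pj * math.log2(pj / qj)
    return total

# Biased coin P mistaken for a fair coin Q (illustrative numbers).
p = [0.7, 0.3]
q = [0.5, 0.5]

print(kl_divergence(p, q))  # about 0.1187 bits
print(kl_divergence(q, p))  # about 0.1258 bits
```

Swapping the arguments gives a different value, which is why the relative entropy fails the symmetry requirement of a metric.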