Variants
As described above, differential entropy does not share all properties of discrete entropy. A modification of differential entropy adds an invariant measure factor m(x) to correct this (see limiting density of discrete points). If m(x) is further constrained to be a probability density, the resulting notion is called relative entropy in information theory:

D(p‖m) = ∫ p(x) log(p(x)/m(x)) dx,

where p is the probability density of X.
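As an illustration (not part of the original article; the Gaussian densities below are assumptions chosen only for the example), the following Python sketch evaluates D(p‖m) by a simple Riemann sum and checks it against the known closed form for two Gaussians:

```python
# Illustrative sketch: relative entropy D(p || m) = ∫ p(x) log(p(x)/m(x)) dx,
# approximated by a Riemann sum. The two Gaussian densities are assumptions
# chosen only for this example; logarithms are natural, so results are in nats.
import numpy as np
from scipy.stats import norm

x = np.linspace(-10.0, 10.0, 100001)
dx = x[1] - x[0]

p = norm.pdf(x, loc=0.0, scale=1.0)   # p(x): density of X, here N(0, 1)
m = norm.pdf(x, loc=1.0, scale=2.0)   # m(x): reference density, here N(1, 4)

kl_numeric = np.sum(p * np.log(p / m)) * dx

# Closed form for D(N(mu1, s1^2) || N(mu2, s2^2)), used as a check.
mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0
kl_exact = np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

print(kl_numeric, kl_exact)   # both are approximately 0.4431
```

Unlike differential entropy itself, relative entropy is invariant under a change of variables applied to both densities, which is what introducing the reference measure restores.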
The definition of differential entropy above can be obtained by partitioning the range of X into bins of length h with associated sample points ih within the bins, for X Riemann integrable. This gives a quantized version of X, defined by Xh = ih if ih ≤ X ≤ (i+1)h. Then the entropy of Xh is

H(Xh) = −Σi h f(ih) log(f(ih)) − Σi h f(ih) log(h),

where f is the probability density function of X.
The first term on the right approximates the differential entropy, while the second term is approximately −log(h), since Σi h f(ih) ≈ ∫ f(x) dx = 1. As h → 0, −log(h) → ∞, so this procedure suggests that the entropy of a continuous random variable, taken in the discrete sense, should be infinite.
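A short numerical sketch of this argument (again an illustration; the standard normal distribution and the bin widths are assumptions chosen for the example) shows the discrete entropy of Xh tracking h(X) − log(h) and therefore diverging as the bins shrink:

```python
# Illustrative sketch of the quantization argument: the discrete entropy of
# X_h approaches h(X) - log(h), which diverges as the bin width h -> 0.
# The standard normal is an assumption chosen only for this example;
# entropies are in nats.
import numpy as np
from scipy.stats import norm

def quantized_entropy(h, support=(-10.0, 10.0)):
    """Discrete Shannon entropy of a standard normal quantized into bins of width h."""
    edges = np.arange(support[0], support[1] + h, h)
    p = np.diff(norm.cdf(edges))   # exact bin probabilities from CDF differences
    p = p[p > 0]                   # drop empty bins before taking logs
    return -np.sum(p * np.log(p))

h_X = 0.5 * np.log(2 * np.pi * np.e)   # differential entropy of N(0, 1)

for h in (0.5, 0.1, 0.01):
    print(f"h = {h:<5}  H(X_h) = {quantized_entropy(h):.4f}  "
          f"h(X) - log(h) = {h_X - np.log(h):.4f}")
```

For h = 0.01 both columns are already about 6.02 nats, and they grow without bound as h decreases further.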