Joint Entropy - Definition

Definition

The joint entropy of two discrete random variables $X$ and $Y$ is defined as

$$H(X, Y) = -\sum_{x} \sum_{y} P(x, y) \log_2 [P(x, y)]$$

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x, y)$ is the probability of these values occurring together, and $P(x, y) \log_2 [P(x, y)]$ is defined to be 0 if $P(x, y) = 0$.
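A minimal sketch of this computation, assuming the joint distribution is given as a NumPy probability table (the function name and example values here are illustrative, not part of the definition above):

```python
import numpy as np

def joint_entropy(P):
    """Joint entropy H(X, Y) in bits from a joint probability table,
    where P[i, j] = P(X = x_i, Y = y_j)."""
    p = np.asarray(P, dtype=float)
    nz = p[p > 0]                     # terms with P(x, y) = 0 contribute 0
    return -np.sum(nz * np.log2(nz))

# Example: X is a fair coin and Y is an exact copy of X.
P = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(joint_entropy(P))  # 1.0 bit: Y adds no information beyond X
```

Masking out the zero entries implements the convention that $P(x, y) \log_2 [P(x, y)]$ is taken to be 0 when $P(x, y) = 0$.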

For more than two variables $X_1, \ldots, X_n$ this expands to

$$H(X_1, \ldots, X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, \ldots, x_n) \log_2 [P(x_1, \ldots, x_n)]$$

where $x_1, \ldots, x_n$ are particular values of $X_1, \ldots, X_n$, respectively, $P(x_1, \ldots, x_n)$ is the probability of these values occurring together, and $P(x_1, \ldots, x_n) \log_2 [P(x_1, \ldots, x_n)]$ is defined to be 0 if $P(x_1, \ldots, x_n) = 0$.
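The same sum works over an n-dimensional table; a self-contained sketch, with hypothetical example values:

```python
import numpy as np

def joint_entropy_n(P):
    """Joint entropy H(X_1, ..., X_n) in bits from an n-dimensional
    joint probability table P[i_1, ..., i_n]."""
    p = np.asarray(P, dtype=float).ravel()  # enumerate all n-tuples
    nz = p[p > 0]                           # zero-probability terms drop out
    return -np.sum(nz * np.log2(nz))

# Three independent fair bits: H(X1, X2, X3) = 1 + 1 + 1 = 3 bits.
P = np.full((2, 2, 2), 1.0 / 8.0)           # uniform over 8 outcomes
print(joint_entropy_n(P))  # 3.0
```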

