Joint Entropy - Definition

Definition

The joint entropy of two variables X and Y is defined as

    H(X, Y) = -\sum_{x} \sum_{y} P(x, y) \log_2 [P(x, y)]

where x and y are particular values of X and Y, respectively, P(x, y) is the probability of these values occurring together, and P(x, y) \log_2 [P(x, y)] is defined to be 0 if P(x, y) = 0.
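As a minimal sketch of this definition, the sum can be computed directly from a table of joint probabilities (the dict layout and function name here are illustrative, not from the source):

```python
import math

def joint_entropy(joint_probs):
    """Joint entropy H(X, Y) in bits, given a dict {(x, y): P(x, y)}."""
    h = 0.0
    for p in joint_probs.values():
        if p > 0:  # the term P log2 P is defined to be 0 when P = 0
            h -= p * math.log2(p)
    return h

# Two independent fair coin flips: four outcomes, each with probability 1/4.
probs = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(joint_entropy(probs))  # → 2.0 bits
```

Skipping the zero-probability terms in the loop implements the convention stated above that 0 log 0 contributes nothing to the sum.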

For more than two variables X_1, \ldots, X_n this expands to

    H(X_1, \ldots, X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, \ldots, x_n) \log_2 [P(x_1, \ldots, x_n)]

where x_1, \ldots, x_n are particular values of X_1, \ldots, X_n, respectively, P(x_1, \ldots, x_n) is the probability of these values occurring together, and P(x_1, \ldots, x_n) \log_2 [P(x_1, \ldots, x_n)] is defined to be 0 if P(x_1, \ldots, x_n) = 0.
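The n-variable sum is computed the same way, iterating over tuples of values. The sketch below (names are illustrative) also shows that dependence between variables lowers the joint entropy: a perfectly correlated pair carries no more information than a single variable.

```python
import math
from itertools import product

def joint_entropy_n(joint_probs):
    """H(X_1, ..., X_n) in bits, given a dict {(x_1, ..., x_n): P(x_1, ..., x_n)}."""
    # Zero-probability terms are skipped: P log2 P is taken to be 0 when P = 0.
    return -sum(p * math.log2(p) for p in joint_probs.values() if p > 0)

# Three independent fair bits: 8 equally likely outcomes, H = 3 bits.
indep = {bits: 0.125 for bits in product((0, 1), repeat=3)}
print(joint_entropy_n(indep))  # → 3.0

# Perfectly correlated pair (Y always equals X): only 1 bit, not 2.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
print(joint_entropy_n(correlated))  # → 1.0
```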
