Joint Entropy - Definition

Definition

The joint entropy of two variables $X$ and $Y$ is defined as

$$H(X,Y) = -\sum_{x}\sum_{y} P(x,y) \log_2 [P(x,y)]$$

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x,y)$ is the probability of these values occurring together, and $P(x,y) \log_2 [P(x,y)]$ is defined to be 0 if $P(x,y) = 0$.
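As a concrete illustration, the sum can be evaluated directly from a table of joint probabilities. The following is a minimal sketch, assuming NumPy is available and using a hypothetical helper name `joint_entropy`; it treats zero-probability cells as contributing 0, as in the definition above.

```python
import numpy as np

def joint_entropy(p_xy):
    """Joint entropy H(X, Y) in bits for a 2-D array of joint probabilities.

    Cells with P(x, y) = 0 contribute 0, matching the convention in the
    definition above.
    """
    p = np.asarray(p_xy, dtype=float)
    nonzero = p[p > 0]                      # drop zero-probability cells
    return -np.sum(nonzero * np.log2(nonzero))

# Two independent fair coin flips: each of the four outcomes has
# probability 1/4, so H(X, Y) = 2 bits.
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])
print(joint_entropy(p_xy))  # 2.0
```

Base-2 logarithms give the entropy in bits; choosing a different base only rescales the result by a constant factor.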

For more than two variables $X_1, \ldots, X_n$ this expands to

$$H(X_1, \ldots, X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, \ldots, x_n) \log_2 [P(x_1, \ldots, x_n)]$$

where $x_1, \ldots, x_n$ are particular values of $X_1, \ldots, X_n$, respectively, $P(x_1, \ldots, x_n)$ is the probability of these values occurring together, and $P(x_1, \ldots, x_n) \log_2 [P(x_1, \ldots, x_n)]$ is defined to be 0 if $P(x_1, \ldots, x_n) = 0$.
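The same computation generalizes directly: the sum simply runs over every cell of the $n$-dimensional joint probability array. A short sketch, again assuming NumPy and a hypothetical `joint_entropy_n` helper:

```python
import numpy as np

def joint_entropy_n(p):
    """Joint entropy H(X_1, ..., X_n) in bits for an n-dimensional array
    of joint probabilities; zero-probability cells contribute 0."""
    p = np.asarray(p, dtype=float)
    nonzero = p[p > 0]
    return -np.sum(nonzero * np.log2(nonzero))

# Three binary variables where X1 and X2 are independent fair bits and
# X3 = X1 XOR X2: only 4 of the 8 cells are nonzero, each with probability
# 1/4, so H(X1, X2, X3) = 2 bits even though each variable alone has 1 bit.
p = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p[x1, x2, x1 ^ x2] = 0.25
print(joint_entropy_n(p))  # 2.0
```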
