Joint Entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables considered together.
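For two discrete random variables X and Y with joint distribution p(x, y), the joint entropy is H(X, Y) = −∑ p(x, y) log p(x, y), where the sum runs over all pairs (x, y). As a rough illustration of that definition, here is a minimal Python sketch; the function name joint_entropy and the two-coin example distribution are illustrative assumptions, not part of the article.

from math import log2

def joint_entropy(joint_probs):
    """Return the joint entropy, in bits, of a dict mapping
    outcome pairs (x, y) to their joint probabilities."""
    # Terms with zero probability contribute nothing, so skip them.
    return -sum(p * log2(p) for p in joint_probs.values() if p > 0)

# Example: two fair coins flipped independently, so every pair
# of outcomes has probability 0.25 and H(X, Y) = 2 bits.
p_xy = {
    ("heads", "heads"): 0.25,
    ("heads", "tails"): 0.25,
    ("tails", "heads"): 0.25,
    ("tails", "tails"): 0.25,
}
print(joint_entropy(p_xy))  # prints 2.0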

Famous quotes containing the words joint and/or entropy:

    There is no such thing as “the Queen’s English.” The property has gone into the hands of a joint stock company and we own the bulk of the shares!
    Mark Twain [Samuel Langhorne Clemens] (1835–1910)

    Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy.
    Václav Havel (b. 1936)