The joint entropy of two discrete random variables X and Y is defined as the entropy of the joint distribution of X and Y:

H(X, Y) = -\sum_{x} \sum_{y} p(x, y) \log_2 p(x, y)
If X and Y are independent, then the joint entropy is simply the sum of their individual entropies: H(X, Y) = H(X) + H(Y).
(Note: The joint entropy should not be confused with the cross entropy, despite similar notations.)
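As an illustration, here is a minimal Python sketch of these two facts. The helper functions entropy and joint_entropy are hypothetical names introduced here (not from any particular library); the example assumes a joint distribution given as a dictionary of probabilities and uses two independent fair coins to check that the joint entropy equals the sum of the individual entropies.

```python
import math

def entropy(probs):
    """Entropy H(X) in bits for a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

def joint_entropy(joint_probs):
    """Joint entropy H(X, Y) in bits for a distribution given as {(x, y): probability}."""
    return -sum(p * math.log2(p) for p in joint_probs.values() if p > 0)

# Two independent fair coins: p(x, y) = p(x) * p(y)
px = {"heads": 0.5, "tails": 0.5}
py = {"heads": 0.5, "tails": 0.5}
joint = {(x, y): px[x] * py[y] for x in px for y in py}

print(joint_entropy(joint))        # 2.0 bits
print(entropy(px) + entropy(py))   # 2.0 bits -- equals H(X, Y) because X and Y are independent
```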