Definition
If $\mathrm{H}(Y \mid X = x)$ is the entropy of the variable $Y$ conditioned on the variable $X$ taking a certain value $x$, then $\mathrm{H}(Y \mid X)$ is the result of averaging $\mathrm{H}(Y \mid X = x)$ over all possible values $x$ that $X$ may take.

Given discrete random variable $X$ with support $\mathcal{X}$ and $Y$ with support $\mathcal{Y}$, the conditional entropy of $Y$ given $X$ is defined as:

$$\mathrm{H}(Y \mid X) \;=\; \sum_{x \in \mathcal{X}} p(x)\, \mathrm{H}(Y \mid X = x) \;=\; -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log \frac{p(x, y)}{p(x)}$$
Note: The supports of $X$ and $Y$ can be replaced by their domains if it is understood that $0 \log \frac{0}{0}$ should be treated as being equal to zero.
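The defining sum can be evaluated directly from a joint probability table. The following is a minimal Python sketch, assuming a small hypothetical joint distribution (the numbers are illustrative and not from the text):

```python
import math

# Hypothetical joint distribution p(x, y) over X in {0, 1} and Y in {0, 1},
# chosen only to illustrate the formula above.
joint = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

# Marginal p(x), obtained by summing the joint distribution over y.
p_x = {}
for (x, _), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p

# H(Y|X) = -sum over x, y of p(x, y) * log( p(x, y) / p(x) ),
# with zero-probability terms contributing zero, per the convention in the note.
h_y_given_x = -sum(
    p * math.log2(p / p_x[x])
    for (x, _), p in joint.items()
    if p > 0
)

print(f"H(Y|X) = {h_y_given_x:.4f} bits")  # about 0.861 bits for this table
```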
$\mathrm{H}(Y \mid X) = 0$ if and only if the value of $Y$ is completely determined by the value of $X$. Conversely, $\mathrm{H}(Y \mid X) = \mathrm{H}(Y)$ if and only if $Y$ and $X$ are independent random variables.
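A quick check of these two extremes, using illustrative fair-coin numbers that are not from the original text: if $X$ is a fair bit and $Y = X$, then for each $x$ the conditional distribution of $Y$ is deterministic, so every $\mathrm{H}(Y \mid X = x) = 0$ and hence $\mathrm{H}(Y \mid X) = 0$; if instead $Y$ is a fair bit independent of $X$, then $p(y \mid x) = p(y) = \tfrac{1}{2}$ for all $x, y$, so $\mathrm{H}(Y \mid X) = \mathrm{H}(Y) = 1$ bit.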