Chain Rule
Assume that the combined system determined by two random variables $X$ and $Y$ has joint entropy $H(X,Y)$, that is, we need $H(X,Y)$ bits of information on average to describe its exact state. Now if we first learn the value of $X$, we have gained $H(X)$ bits of information. Once $X$ is known, we only need $H(X,Y) - H(X)$ bits to describe the state of the whole system. This quantity is exactly $H(Y\mid X)$, which gives the chain rule of conditional entropy:

$$H(Y\mid X) = H(X,Y) - H(X).$$
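As a concrete illustration (this example is added here for clarity, not part of the original text): if $X$ and $Y$ are two independent fair coin flips, then

$$H(X,Y) = 2 \text{ bits}, \qquad H(X) = 1 \text{ bit}, \qquad H(Y\mid X) = 2 - 1 = 1 \text{ bit},$$

matching the intuition that learning $X$ reveals nothing about $Y$, so describing $Y$ still costs a full bit.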
Formally, the chain rule indeed follows from the above definition of conditional entropy:

$$\begin{aligned}
H(Y\mid X) &= \sum_{x,y} p(x,y) \log \frac{p(x)}{p(x,y)} \\
&= \sum_{x,y} p(x,y) \bigl(\log p(x) - \log p(x,y)\bigr) \\
&= -\sum_{x,y} p(x,y) \log p(x,y) + \sum_{x} p(x) \log p(x) \\
&= H(X,Y) - H(X).
\end{aligned}$$
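The identity is also easy to check numerically. Below is a minimal Python sketch; the particular joint distribution `p_xy` is an illustrative assumption, not from the text. It computes $H(Y\mid X)$ directly from the definition and compares it against $H(X,Y) - H(X)$:

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1, 2};
# the specific numbers are illustrative assumptions (they sum to 1).
p_xy = np.array([[0.20, 0.10, 0.05],
                 [0.15, 0.30, 0.20]])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_XY = entropy(p_xy)        # joint entropy H(X, Y)
p_x = p_xy.sum(axis=1)      # marginal distribution p(x)
H_X = entropy(p_x)          # marginal entropy H(X)

# Conditional entropy directly from its definition:
# H(Y|X) = -sum_{x,y} p(x,y) * log2 p(y|x), with p(y|x) = p(x,y) / p(x).
# (This table has no zero cells, so the logarithm is safe everywhere.)
p_y_given_x = p_xy / p_x[:, None]
H_Y_given_X = -np.sum(p_xy * np.log2(p_y_given_x))

print(H_Y_given_X)          # from the definition
print(H_XY - H_X)           # from the chain rule: agrees up to rounding
```

Both printed values coincide, as the derivation above guarantees for any joint distribution.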