Conditional Independence
In probability theory, two events R and B are conditionally independent given a third event Y precisely if the occurrence or non-occurrence of R and the occurrence or non-occurrence of B are independent events in their conditional probability distribution given Y. In other words, R and B are conditionally independent given Y if and only if, given knowledge that Y occurs, knowledge of whether R occurs provides no information on the likelihood of B occurring, and knowledge of whether B occurs provides no information on the likelihood of R occurring.
In the standard notation of probability theory, R and B are conditionally independent given Y if and only if
Pr(R ∩ B | Y) = Pr(R | Y) Pr(B | Y),
or equivalently,
Pr(R | B ∩ Y) = Pr(R | Y).
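The two formulations agree whenever Pr(B ∩ Y) > 0, by the definition of conditional probability; a one-line check (added here for clarity, not part of the original text):

```latex
\Pr(R \mid B \cap Y)
  = \frac{\Pr(R \cap B \cap Y)}{\Pr(B \cap Y)}
  = \frac{\Pr(R \cap B \mid Y)}{\Pr(B \mid Y)}
  = \frac{\Pr(R \mid Y)\,\Pr(B \mid Y)}{\Pr(B \mid Y)}
  = \Pr(R \mid Y).
```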
Two random variables X and Y are conditionally independent given a third random variable Z if and only if they are independent in their conditional probability distribution given Z. That is, X and Y are conditionally independent given Z if and only if, given any value of Z, the probability distribution of X is the same for all values of Y and the probability distribution of Y is the same for all values of X.
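When X, Y, and Z take finitely many values, this definition can be checked directly from a joint probability table. The sketch below (a NumPy illustration; the function name and table layout are assumptions of this example, not from the article) tests whether P(X, Y | Z = z) factorises into P(X | Z = z) · P(Y | Z = z) for every value z that Z takes with positive probability.

```python
import numpy as np

def is_conditionally_independent(p, tol=1e-12):
    """Check X ⊥ Y | Z for a discrete joint table p[x, y, z]."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalise to a probability table
    for z in range(p.shape[2]):
        slab = p[:, :, z]
        pz = slab.sum()
        if pz == 0:
            continue                     # only condition on values Z actually takes
        joint_given_z = slab / pz        # P(X, Y | Z = z)
        px_given_z = joint_given_z.sum(axis=1, keepdims=True)   # P(X | Z = z)
        py_given_z = joint_given_z.sum(axis=0, keepdims=True)   # P(Y | Z = z)
        if not np.allclose(joint_given_z, px_given_z * py_given_z, atol=tol):
            return False
    return True

# Tiny demo: Z is a fair coin; given Z = z, X and Y are i.i.d. coins with a
# bias that depends on z. By construction X ⊥ Y | Z should hold.
table = np.zeros((2, 2, 2))
for z, bias in enumerate([0.9, 0.2]):
    for xval in (0, 1):
        for yval in (0, 1):
            px = bias if xval == 0 else 1 - bias
            py = bias if yval == 0 else 1 - bias
            table[xval, yval, z] = 0.5 * px * py
print(is_conditionally_independent(table))   # True
```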
Two events R and B are conditionally independent given a σ-algebra Σ if
Pr(R ∩ B | Σ) = Pr(R | Σ) Pr(B | Σ) almost surely,
where Pr(A | Σ) denotes the conditional expectation of the indicator function of the event A, given the sigma algebra Σ. That is,
Pr(A | Σ) := E[1_A | Σ].
Two random variables X and Y are conditionally independent given a σ-algebra Σ if the above equation holds for all R in σ(X) and B in σ(Y).
Two random variables X and Y are conditionally independent given a random variable W if they are independent given σ(W): the σ-algebra generated by W. This is commonly written:
X ⊥⊥ Y | W, or equivalently, X ⊥ Y | W.
This is read "X is independent of Y, given W"; the conditioning applies to the whole statement: "(X is independent of Y) given W".
If W assumes a countable set of values, this is equivalent to the conditional independence of X and Y for the events of the form [W = w]. Conditional independence of more than two events, or of more than two random variables, is defined analogously.
The following two examples show that X ⊥ Y neither implies nor is implied by X ⊥ Y | W.

First, suppose W is 0 with probability 0.5 and is 1 otherwise. When W = 0, take X and Y to be independent, each having the value 0 with probability 0.99 and the value 1 otherwise. When W = 1, X and Y are again independent, but this time they take the value 1 with probability 0.99. Then X ⊥ Y | W. But X and Y are dependent, because Pr(X = 0) < Pr(X = 0 | Y = 0). This is because Pr(X = 0) = 0.5, but if Y = 0 then it is very likely that W = 0 and thus that X = 0 as well, so Pr(X = 0 | Y = 0) > 0.5.

For the second example, suppose X ⊥ Y, each taking the values 0 and 1 with probability 0.5. Let W be the product X × Y. Then when W = 0, Pr(X = 0) = 2/3, but Pr(X = 0 | Y = 0) = 1/2, so X ⊥ Y | W is false. This is also an example of explaining away; see Kevin Murphy's tutorial, where X and Y take the values "brainy" and "sporty".
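A small Monte Carlo simulation (a NumPy sketch using the same distributions as the two examples above; not part of the original text) makes the numbers concrete: the printed frequencies should be close to the probabilities quoted in the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# First example: X ⊥ Y | W holds, but X and Y are marginally dependent.
w_is_zero = rng.random(n) < 0.5               # W = 0 with probability 0.5
# Given W = 0, X and Y are independently 0 with probability 0.99;
# given W = 1, they are independently 0 with probability 0.01.
p_zero = np.where(w_is_zero, 0.99, 0.01)
x = (rng.random(n) >= p_zero).astype(int)
y = (rng.random(n) >= p_zero).astype(int)
print("Pr(X=0)       ≈", np.mean(x == 0))           # ≈ 0.5
print("Pr(X=0 | Y=0) ≈", np.mean(x[y == 0] == 0))   # ≈ 0.98 > 0.5, so dependent

# Second example: X ⊥ Y marginally, but not conditionally given W = X × Y.
x = rng.integers(0, 2, n)
y = rng.integers(0, 2, n)
w = x * y
mask = w == 0
print("Pr(X=0 | W=0)      ≈", np.mean(x[mask] == 0))              # ≈ 2/3
print("Pr(X=0 | Y=0, W=0) ≈", np.mean(x[mask & (y == 0)] == 0))   # ≈ 1/2
```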