In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known. Here, information is measured in bits, nats, or bans. The entropy of Y conditioned on X is written as H(Y|X).
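As a minimal illustration of the definition, the sketch below computes H(Y|X) = -Σ p(x,y) log p(y|x) from a joint distribution. The function name `conditional_entropy` and the example distribution (a fair coin X and a noisy copy Y that agrees with X with probability 3/4) are assumptions chosen for illustration, not part of the original text.

```python
import math

def conditional_entropy(joint, base=2):
    """Compute H(Y|X) from a joint distribution given as a dict
    mapping (x, y) pairs to probabilities. Default base 2 gives bits."""
    # Marginal of X: p(x) = sum over y of p(x, y)
    p_x = {}
    for (x, _), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p

    # H(Y|X) = -sum over (x, y) of p(x, y) * log p(y|x),
    # where p(y|x) = p(x, y) / p(x)
    h = 0.0
    for (x, _), p_xy in joint.items():
        if p_xy > 0:
            h -= p_xy * math.log(p_xy / p_x[x], base)
    return h

# Hypothetical example: X is a fair coin, Y equals X with probability 3/4.
joint = {(0, 0): 0.375, (0, 1): 0.125, (1, 0): 0.125, (1, 1): 0.375}
print(conditional_entropy(joint))  # about 0.811 bits
```

Knowing X leaves about 0.811 bits of uncertainty in Y, less than the 1 bit of entropy Y has on its own, which matches the intuition that conditioning on related information can only reduce uncertainty.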