In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable Y given that the value of another random variable X is known. Here, information is measured in bits, nats, or bans. The entropy of Y conditioned on X is written as H(Y | X).
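Concretely, H(Y | X) can be computed from a joint distribution as H(Y | X) = −Σ p(x, y) log p(y | x). The sketch below (not from the source; the function name and the dict-based joint-distribution representation are illustrative assumptions) shows a minimal implementation:

```python
import math

def conditional_entropy(joint, base=2):
    """Conditional entropy H(Y|X) from a joint distribution.

    joint: dict mapping (x, y) pairs to probabilities p(x, y).
    base:  logarithm base (2 -> bits, e -> nats, 10 -> bans).
    """
    # Marginal distribution p(x) = sum over y of p(x, y)
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p

    # H(Y|X) = -sum p(x, y) * log p(y|x), with p(y|x) = p(x, y) / p(x)
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h -= p * math.log(p / px[x], base)
    return h

# X fully determines Y: knowing X leaves nothing to describe, so H(Y|X) = 0
joint_det = {(0, 0): 0.5, (1, 1): 0.5}

# X and Y are independent fair coins: H(Y|X) = H(Y) = 1 bit
joint_ind = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

print(conditional_entropy(joint_det))  # 0.0
print(conditional_entropy(joint_ind))  # 1.0
```

The two examples mark the extremes: conditional entropy is zero when X determines Y, and equals the unconditional entropy H(Y) when X carries no information about Y.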