Definition
Named after Boltzmann's H-theorem, Shannon defined the entropy H of a discrete random variable X with possible values {x_1, ..., x_n} and probability mass function P(X) as

$$H(X) = \operatorname{E}[I(X)] = \operatorname{E}[-\log_b P(X)].$$
Here E is the expected value operator, and I is the information content of X.
I(X) is itself a random variable. The entropy can be written explicitly as

$$H(X) = \sum_{i=1}^{n} P(x_i)\, I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i),$$
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10; the corresponding unit of entropy is the bit for b = 2, the nat for b = e, and the dit (or digit) for b = 10.
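The explicit sum above can be evaluated directly in code. The following is a minimal sketch, not part of the original text: the function name entropy, the choice of Python, and the example distributions are illustrative assumptions, and all probabilities are assumed strictly positive here (the zero-probability convention is treated below).

    import math

    def entropy(probs, base=2):
        """Shannon entropy H(X) = -sum_i P(x_i) * log_b P(x_i) for a discrete
        probability mass function given as a sequence of positive probabilities."""
        return -sum(p * math.log(p, base) for p in probs)

    # A fair coin carries exactly 1 bit of entropy (b = 2).
    print(entropy([0.5, 0.5]))               # 1.0
    # A biased coin is less uncertain, so its entropy is below 1 bit.
    print(entropy([0.9, 0.1]))               # ~0.469 bits
    # The same fair coin measured in nats (b = e).
    print(entropy([0.5, 0.5], base=math.e))  # ~0.693 nats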
In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 log_b 0 is taken to be 0, which is consistent with the well-known limit:

$$\lim_{p \to 0^{+}} p \log_b p = 0.$$
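To illustrate this convention, here is a hedged sketch (not from the source): dropping zero-probability terms from the sum gives the same result as setting 0 log_b 0 = 0, and the limit can be checked numerically. The helper name entropy_with_zeros is an assumption for demonstration.

    import math

    def entropy_with_zeros(probs, base=2):
        """Entropy with the convention 0 * log_b(0) = 0: outcomes with
        zero probability contribute nothing to the sum."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # p * log_b(p) tends to 0 as p approaches 0 from above.
    for p in (1e-2, 1e-4, 1e-8):
        print(p * math.log(p, 2))    # -0.066..., -0.0013..., -2.7e-07

    # A distribution with an impossible outcome has the same entropy
    # as the distribution with that outcome removed.
    print(entropy_with_zeros([0.5, 0.5, 0.0]))  # 1.0, same as a fair coin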