Definition
Named after Boltzmann's H-theorem, Shannon denoted the entropy H of a discrete random variable X with possible values {x_1, ..., x_n} and probability mass function P(X) as

$$\mathrm{H}(X) = \operatorname{E}[\operatorname{I}(X)] = \operatorname{E}[-\log(P(X))].$$
Here E is the expected value operator, and I is the information content of X.
I(X) is itself a random variable. The entropy can explicitly be written as

$$\mathrm{H}(X) = \sum_{i=1}^{n} P(x_i)\,\operatorname{I}(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i),$$
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10, and the unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10.
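To make the formula and its units concrete, consider a fair coin flip: X takes two values, each with probability 1/2. With b = 2,

$$\mathrm{H}(X) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = \tfrac{1}{2} + \tfrac{1}{2} = 1 \text{ bit},$$

so one fair coin flip carries exactly one bit of entropy.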
In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 log_b(0) is taken to be 0, which is consistent with the well-known limit:

$$\lim_{p \to 0^{+}} p \log_b(p) = 0.$$
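The definition translates directly into a few lines of code. Below is a minimal sketch in Python (the function name `entropy` and its interface are illustrative, not from the source) that evaluates the sum above for a given probability mass function and base, applying the 0 log_b(0) = 0 convention:

```python
import math

def entropy(pmf, b=2):
    """Shannon entropy of a discrete distribution.

    pmf: probabilities P(x_1), ..., P(x_n), assumed to sum to 1
    b:   base of the logarithm -- 2 (bits), math.e (nats), 10 (dits)
    """
    total = 0.0
    for p in pmf:
        # Summands with p == 0 contribute 0, consistent with
        # the limit lim_{p -> 0+} p * log_b(p) = 0.
        if p > 0:
            total -= p * math.log(p, b)
    return total

print(entropy([0.5, 0.5]))            # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))            # ~0.47 bits: a biased coin is more predictable
print(entropy([1.0, 0.0]))            # 0.0 bits: a certain outcome carries no information
print(entropy([0.5, 0.5], b=math.e))  # ~0.693 nats (= ln 2)
```

Changing b only rescales the result by a constant factor, since log_b(x) = ln(x)/ln(b); this is why the choice of base amounts to a choice of unit.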