Definition
Named after Boltzmann's H-theorem, Shannon defined the entropy H of a discrete random variable X with possible values {x1, ..., xn} and probability mass function P(X) as

$$H(X) = \operatorname{E}[I(X)] = \operatorname{E}[-\log P(X)].$$
Here E is the expected value operator, and I is the information content of X.
I(X) is itself a random variable. The entropy can explicitly be written as

$$H(X) = \sum_{i=1}^{n} P(x_i)\, I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i),$$
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10, and the unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10.
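As a brief illustration, this formula can be computed directly. The following Python sketch is not from the original text; the function name entropy and the fair-coin example are illustrative assumptions. It skips zero-probability terms, in line with the convention discussed below:

    import math

    def entropy(pmf, base=2):
        """Shannon entropy of a discrete distribution, given as a
        sequence of probabilities: H = -sum(p * log_b(p))."""
        # Skip p = 0 terms, following the 0 * log 0 = 0 convention
        # discussed below.
        return -sum(p * math.log(p, base) for p in pmf if p > 0)

    # A fair coin carries exactly 1 bit of entropy:
    print(entropy([0.5, 0.5], base=2))        # 1.0
    # The same distribution measured in nats (b = e) and dits (b = 10):
    print(entropy([0.5, 0.5], base=math.e))   # 0.6931... (= ln 2)
    print(entropy([0.5, 0.5], base=10))       # 0.3010... (= log10 2)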
In the case of P(xi) = 0 for some i, the value of the corresponding summand 0 logb 0 is taken to be 0, which is consistent with the well-known limit:

$$\lim_{p \to 0^{+}} p \log(p) = 0.$$
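A quick numerical check, offered here as an illustrative sketch rather than part of the original text, shows the product vanishing as p shrinks:

    import math

    # p * log2(p) tends to 0 as p approaches 0 from above, which
    # justifies assigning the value 0 to the 0 * log 0 summand.
    for p in [0.1, 0.01, 0.001, 1e-6]:
        print(f"p = {p:g}: p * log2(p) = {p * math.log2(p):.6g}")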