Definition
Named after Boltzmann's H-theorem, Shannon defined the entropy $H$ of a discrete random variable $X$ with possible values $\{x_1, \ldots, x_n\}$ and probability mass function $P(X)$ as:

$$H(X) = \mathrm{E}[I(X)] = \mathrm{E}[-\log_b(P(X))].$$
Here $\mathrm{E}$ is the expected value operator, and $I$ is the information content of $X$.
$I(X)$ is itself a random variable. The entropy can explicitly be written as

$$H(X) = \sum_{i=1}^{n} P(x_i)\, I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i),$$
where $b$ is the base of the logarithm used. Common values of $b$ are 2, Euler's number $e$, and 10; the unit of entropy is the bit for $b = 2$, the nat for $b = e$, and the dit (or digit) for $b = 10$.
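To make the formula concrete, here is a minimal Python sketch (the function name shannon_entropy and its probs/base parameters are illustrative, not from the source); it skips zero-probability outcomes, anticipating the convention discussed next:

    import math

    def shannon_entropy(probs, base=2):
        """Return H(X) = -sum_i P(x_i) * log_b(P(x_i)) for a discrete distribution."""
        # Outcomes with zero probability are skipped, matching the
        # convention that 0 * log(0) is taken to be 0.
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # A fair coin has 1 bit of entropy in base 2 and ln(2) ≈ 0.693 nat in base e.
    print(shannon_entropy([0.5, 0.5]))          # 1.0
    print(shannon_entropy([0.5, 0.5], math.e))  # 0.6931...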
In the case of $P(x_i) = 0$ for some $i$, the value of the corresponding summand $0 \log_b 0$ is taken to be $0$, which is consistent with the well-known limit:

$$\lim_{p \to 0^{+}} p \log_b(p) = 0.$$
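For completeness, one standard way to verify this limit is L'Hôpital's rule, rewriting in natural logarithms (this derivation is a sketch added here, not part of the source text):

$$\lim_{p \to 0^{+}} p \log_b p = \frac{1}{\ln b} \lim_{p \to 0^{+}} \frac{\ln p}{1/p} = \frac{1}{\ln b} \lim_{p \to 0^{+}} \frac{1/p}{-1/p^{2}} = \frac{1}{\ln b} \lim_{p \to 0^{+}} (-p) = 0.$$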