Entropy (Information Theory) - Definition

Definition

Named after Boltzmann's H-theorem, Shannon denoted the entropy H of a discrete random variable X with possible values {x_1, ..., x_n} and probability mass function P(X) as

    H(X) = \mathrm{E}[I(X)] = \mathrm{E}[-\log_b(P(X))].

Here E is the expected value operator, and I is the information content of X.

I(X) is itself a random variable. The entropy can be written explicitly as

    H(X) = \sum_{i=1}^{n} P(x_i) I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b(P(x_i)),

where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10; the corresponding unit of entropy is the bit for b = 2, the nat for b = e, and the dit (or digit) for b = 10.
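
To make the explicit sum concrete, here is a minimal Python sketch, added for illustration and not taken from the original article; the function name shannon_entropy and the fair-coin example are assumptions chosen for this sketch.

    import math

    def shannon_entropy(pmf, base=2):
        # Compute H(X) = -sum_i P(x_i) * log_b(P(x_i)) for a discrete pmf
        # given as a list of probabilities. Outcomes with probability 0 are
        # skipped, following the 0 * log_b(0) = 0 convention noted below.
        return -sum(p * math.log(p, base) for p in pmf if p > 0)

    # A fair coin carries exactly 1 bit of entropy (b = 2); measuring the
    # same distribution in nats just changes the base to e.
    fair_coin = [0.5, 0.5]
    print(shannon_entropy(fair_coin, base=2))       # 1.0  (bits)
    print(shannon_entropy(fair_coin, base=math.e))  # ~0.693  (nats)

Because log_b(x) = ln(x)/ln(b), switching the base only rescales the result by a constant factor, which is why the choice of b amounts to a choice of unit.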

In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 log_b(0) is taken to be 0, which is consistent with the well-known limit:

    \lim_{p \to 0^{+}} p \log_b(p) = 0.
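
For readers who want to verify this limit, one short route (added here as a sketch, not part of the original text) is to rewrite p ln(p) as a quotient of two diverging functions and apply L'Hôpital's rule:

    \lim_{p \to 0^{+}} p \ln(p)
        = \lim_{p \to 0^{+}} \frac{\ln(p)}{1/p}
        = \lim_{p \to 0^{+}} \frac{1/p}{-1/p^{2}}
        = \lim_{p \to 0^{+}} (-p)
        = 0,

and since \log_b(p) = \ln(p)/\ln(b), the same limit holds for any base b.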
