Entropy (Information Theory) - Definition

Definition

Named after Boltzmann's H-theorem, Shannon defined the entropy H of a discrete random variable X with possible values {x_1, ..., x_n} and probability mass function P(X) as

H(X) = \mathrm{E}[I(X)] = \mathrm{E}[-\log_b P(X)].

Here E is the expected value operator, and I is the information content of X.

I(X) is itself a random variable. The entropy can be written explicitly as

H(X) = \sum_{i=1}^{n} P(x_i)\, I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i),

where b is the base of the logarithm. Common values of b are 2, Euler's number e, and 10; the corresponding units of entropy are the bit for b = 2, the nat for b = e, and the dit (or digit) for b = 10.
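As a quick illustration (not part of the original text): a fair coin with P(x_1) = P(x_2) = 1/2 has entropy

H(X) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit},

the maximum possible for two outcomes.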

In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 \log_b 0 is taken to be 0, which is consistent with the well-known limit

\lim_{p \to 0^+} p \log_b p = 0.
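As a minimal sketch (not from the original article), the following Python function computes H(X) for a distribution given as a sequence of probabilities, using base b = 2 by default and applying the 0 log_b 0 = 0 convention just described; the function name and interface are illustrative assumptions:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) of a discrete distribution.

    `probs` is a sequence of probabilities P(x_i); zero-probability
    outcomes are skipped, applying the 0 * log_b(0) = 0 convention.
    """
    return sum(-p * math.log(p, base) for p in probs if p > 0)

# Fair coin: maximal uncertainty for two outcomes.
print(entropy([0.5, 0.5]))   # 1.0 bit
# Biased coin: less uncertainty.
print(entropy([0.9, 0.1]))   # ~0.47 bits
# Degenerate distribution: no uncertainty; the 0 log 0 term is dropped.
print(entropy([1.0, 0.0]))   # 0.0
```

Passing base=math.e or base=10 yields the entropy in nats or dits, respectively.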

