Entropy Rate

In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in the process. For a stochastic process with a countable index, the entropy rate H(X) is the limit of the joint entropy of the first n members of the process X_k, divided by n, as n tends to infinity:

$$H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n),$$

when the limit exists. An alternative, related quantity is the conditional entropy of the latest member given the entire past:

$$H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1).$$

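As a quick check that the two definitions are consistent, consider the simplest case of an i.i.d. process (this worked example is not in the original text): the joint entropy splits into a sum of identical terms, so

$$H(X) = \lim_{n \to \infty} \frac{1}{n} \sum_{k=1}^{n} H(X_k) = H(X_1),$$

while independence makes the conditioning vanish,

$$H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \dots, X_1) = H(X_1),$$

so both quantities reduce to the entropy of a single symbol, in agreement with the equality stated below for stationary processes.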
For strongly stationary stochastic processes, the two limits coincide: $H(X) = H'(X)$. The entropy rate can be thought of as a general property of stochastic sources; this idea is formalized by the asymptotic equipartition property.
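For a concrete (illustrative, not from the original article) case, the entropy rate of an irreducible Markov chain with transition matrix P and stationary distribution mu has the well-known closed form H(X) = -sum_i mu_i sum_j P_ij log2 P_ij. The sketch below estimates mu by power iteration and evaluates this formula; the function names are assumptions for this example.

```python
import math

def stationary_distribution(P, iters=1000):
    """Approximate the stationary distribution mu (mu P = mu) by power iteration."""
    n = len(P)
    mu = [1.0 / n] * n
    for _ in range(iters):
        mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]
    return mu

def entropy_rate(P):
    """Entropy rate in bits per step of a Markov chain with transition matrix P."""
    mu = stationary_distribution(P)
    n = len(P)
    # Sum over transitions with nonzero probability (0 log 0 is taken as 0).
    return -sum(
        mu[i] * P[i][j] * math.log2(P[i][j])
        for i in range(n)
        for j in range(n)
        if P[i][j] > 0
    )

# Two-state chain: stays in state 0 with prob 0.9, stays in state 1 with prob 0.8.
P = [[0.9, 0.1],
     [0.2, 0.8]]
print(entropy_rate(P))  # about 0.553 bits per step
```

Note that each step of the chain contributes less than one bit here, because the chain is "sticky": the next state is usually predictable from the current one, which is exactly what the conditional-entropy form H'(X) captures.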

