Entropy Rate

In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in the process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of the process divided by n, as n tends to infinity:

    H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)
when the limit exists. An alternative, related quantity is the limiting conditional entropy of the latest variable given the past:

    H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1)

For strongly stationary stochastic processes the two limits coincide, H(X) = H'(X). The entropy rate can be thought of as a general property of stochastic sources; that long typical sequences carry about H(X) bits of information per symbol is the content of the asymptotic equipartition property.
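As a concrete illustration, the Python sketch below computes the entropy rate of a small two-state Markov chain and checks numerically that the normalized block entropies H(X_1, …, X_n)/n approach it as n grows. The transition matrix P and its values are made up purely for this example; the sketch relies on the standard fact that, for a stationary Markov chain, the entropy rate equals -Σ_i μ_i Σ_j P_ij log2 P_ij, where μ is the stationary distribution.

    import itertools
    import math

    # A hypothetical 2-state Markov chain (states 0 and 1); the transition
    # probabilities are arbitrary values chosen only for illustration.
    P = [[0.9, 0.1],
         [0.4, 0.6]]

    # Stationary distribution mu solving mu P = mu (closed form for 2 states).
    mu1 = P[0][1] / (P[0][1] + P[1][0])   # long-run probability of state 1
    mu = [1.0 - mu1, mu1]

    # Analytic entropy rate of a stationary Markov chain:
    #   H = -sum_i mu_i sum_j P_ij log2 P_ij   (bits per step)
    H_rate = -sum(mu[i] * P[i][j] * math.log2(P[i][j])
                  for i in range(2) for j in range(2))
    print(f"analytic entropy rate: {H_rate:.4f} bits/step")

    def block_entropy(n):
        """Exact joint entropy H(X_1, ..., X_n), with X_1 drawn from mu."""
        total = 0.0
        for seq in itertools.product(range(2), repeat=n):
            p = mu[seq[0]]
            for a, b in zip(seq, seq[1:]):
                p *= P[a][b]
            total -= p * math.log2(p)
        return total

    # H(X_1, ..., X_n)/n should approach H_rate as n grows.
    for n in (1, 2, 4, 8, 12):
        print(f"n={n:2d}  H(X_1..X_n)/n = {block_entropy(n)/n:.4f}")

Because the chain is started in its stationary distribution, the block entropy is exactly H(X_1) + (n-1)·H_rate, so the printed ratios converge to the analytic rate at speed 1/n.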


Famous quotes containing the words entropy and/or rate:

    Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy.
    Václav Havel (b. 1936)

    Writing a book I have found to be like building a house. A man forms a plan, and collects materials. He thinks he has enough to raise a large and stately edifice; but after he has arranged, compacted and polished, his work turns out to be a very small performance. The authour however like the builder, knows how much labour his work has cost him; and therefore estimates it at a higher rate than other people think it deserves.
    James Boswell (1740–1795)