Entropy Rates For Markov Chains

Since a stochastic process defined by an irreducible, aperiodic Markov chain has a unique stationary distribution, its entropy rate is independent of the initial distribution.

For example, for such a Markov chain $Y_k$ defined on a countable number of states, given its transition matrix $P_{ij}$, the entropy rate $H(Y)$ is given by

$$H(Y) = -\sum_{ij} \mu_i P_{ij} \log P_{ij},$$

where $\mu_i$ is the stationary distribution of the chain.
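As an illustration, the formula can be evaluated numerically. The sketch below (a minimal example assuming NumPy; the function names stationary_distribution and entropy_rate are introduced here for illustration, not taken from the source) finds the stationary distribution as the left eigenvector of the transition matrix with eigenvalue 1 and then sums $-\mu_i P_{ij} \log_2 P_{ij}$:

```python
import numpy as np

def stationary_distribution(P):
    """Left eigenvector of P with eigenvalue 1, normalized to sum to 1."""
    w, v = np.linalg.eig(P.T)
    # Pick the eigenvector whose eigenvalue is closest to 1.
    mu = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return mu / mu.sum()

def entropy_rate(P):
    """H(Y) = -sum_ij mu_i P_ij log2 P_ij, in bits per step."""
    mu = stationary_distribution(P)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log2(P), 0.0)  # convention: 0 log 0 = 0
    return -np.sum(mu[:, None] * terms)

# Two-state chain: stay with probability 0.9, switch with probability 0.1.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
print(entropy_rate(P))  # ~0.469 bits per step
```

For this symmetric two-state chain the stationary distribution is uniform, so the entropy rate equals the binary entropy of the switching probability, about 0.469 bits per step.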

A simple consequence of this definition is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process.
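This follows because for an i.i.d. process every row of the transition matrix equals the marginal distribution $p$, so $\mu = p$ and the double sum collapses to $-\sum_j p_j \log p_j$. A small self-contained check of this (again assuming NumPy; the example distribution is chosen arbitrarily for illustration):

```python
import numpy as np

# i.i.d. process: every row of the transition matrix is the marginal
# distribution p, so mu = p and the entropy rate reduces to H(p).
p = np.array([0.5, 0.25, 0.25])
P_iid = np.tile(p, (3, 1))  # each row is p
H_rate = -np.sum(p[:, None] * np.where(P_iid > 0, P_iid * np.log2(P_iid), 0.0))
H_single = -np.sum(p * np.log2(p))  # entropy of a single symbol
print(H_rate, H_single)  # both 1.5 bits
```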
