
Markov Chains

The probability of going from state i to state j in n time steps is

$$p_{ij}^{(n)} = \Pr(X_n = j \mid X_0 = i),$$

and the single-step transition is

$$p_{ij} = \Pr(X_1 = j \mid X_0 = i).$$

For a time-homogeneous Markov chain:

$$p_{ij}^{(n)} = \Pr(X_{k+n} = j \mid X_k = i)$$

and

$$p_{ij} = \Pr(X_{k+1} = j \mid X_k = i).$$

The n-step transition probabilities satisfy the Chapman–Kolmogorov equation: for any k such that 0 < k < n,

$$p_{ij}^{(n)} = \sum_{r \in S} p_{ir}^{(k)} \, p_{rj}^{(n-k)},$$

where S is the state space of the Markov chain.
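
The Chapman–Kolmogorov equation can be checked numerically: collecting the one-step probabilities p_ij into a matrix P, the n-step probabilities are the entries of the matrix power P^n, and the equation says the k-step and (n−k)-step matrices multiply to the n-step matrix. A minimal sketch, using a hypothetical 3-state transition matrix chosen purely for illustration:

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1);
# the specific values are arbitrary, for illustration only.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

n, k = 5, 2  # any k with 0 < k < n

# n-step transition probabilities: entries of the n-th matrix power.
P_n = np.linalg.matrix_power(P, n)

# Chapman–Kolmogorov: p_ij^(n) = sum over r of p_ir^(k) * p_rj^(n-k),
# i.e. matrix multiplication of the k-step and (n-k)-step matrices.
lhs = P_n
rhs = np.linalg.matrix_power(P, k) @ np.linalg.matrix_power(P, n - k)
assert np.allclose(lhs, rhs)
```

Since each P^m is again a stochastic matrix, every row of P_n still sums to 1.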

The marginal distribution Pr(X_n = x) is the distribution over states at time n. The initial distribution is Pr(X_0 = x). The evolution of the process through one time step is described by

$$\Pr(X_n = j) = \sum_{r \in S} p_{rj} \Pr(X_{n-1} = r) = \sum_{r \in S} p_{rj}^{(n)} \Pr(X_0 = r).$$

Note: The superscript (n) is an index and not an exponent.
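
The one-step evolution above amounts to multiplying the current distribution, viewed as a row vector, by the transition matrix on the right; iterating n times is equivalent to one multiplication by the n-step matrix. A short sketch, again with a hypothetical 3-state matrix and an assumed initial distribution concentrated on state 0:

```python
import numpy as np

# Hypothetical 3-state transition matrix (illustrative values).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Assumed initial distribution: start in state 0 with certainty.
mu0 = np.array([1.0, 0.0, 0.0])

# One step of evolution: Pr(X_1 = j) = sum over r of p_rj * Pr(X_0 = r),
# i.e. a row vector times P.
mu1 = mu0 @ P

# Iterating n steps gives the marginal at time n ...
n = 4
mu_n = mu0.copy()
for _ in range(n):
    mu_n = mu_n @ P

# ... which matches a single multiplication by the n-step matrix,
# as the second equality in the evolution equation states.
assert np.allclose(mu_n, mu0 @ np.linalg.matrix_power(P, n))
```

Because mu0 puts all mass on state 0, one step simply reads off the first row of P, and each mu_n remains a probability vector.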

