A Markov chain, named after Andrey Markov, is a mathematical system that undergoes transitions from one state to another within a finite or countable set of possible states. It is a random process that is usually characterized as memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property. Markov chains have many applications as statistical models of real-world processes.
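To make the Markov property concrete, here is a minimal sketch of a two-state chain in Python; the weather states and transition probabilities are illustrative assumptions, not taken from the text above. Each step samples the next state from a distribution that depends only on the current state, never on the earlier history of the path.

```python
import random

# Illustrative two-state Markov chain (states and probabilities are assumed
# for the example, not specified in the article).
# transition[s] maps the current state s to a distribution over next states.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state using only the current state (the Markov property)."""
    states = list(transition[current].keys())
    weights = list(transition[current].values())
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    """Generate a sample path of the chain starting from `start`."""
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1]))
    return path

print(simulate("sunny", 10))
```

Because `next_state` looks only at its `current` argument, the sequence it produces depends on the past solely through the present state, which is exactly the memorylessness described above.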