Markov Chain - Formal Definition

A Markov chain is a sequence of random variables X_1, X_2, X_3, ... with the Markov property, namely that, given the present state, the future and past states are independent. Formally,

Pr(X_{n+1} = x | X_1 = x_1, X_2 = x_2, ..., X_n = x_n) = Pr(X_{n+1} = x | X_n = x_n),

whenever both conditional probabilities are well defined, that is, whenever Pr(X_1 = x_1, ..., X_n = x_n) > 0.

The possible values of X_i form a countable set S called the state space of the chain.

Markov chains are often described by a directed graph in which the nodes are the states and each edge is labeled with the probability of moving from one state to another, as sketched below.
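
To make the graph representation concrete, here is a minimal Python sketch. The two-state weather chain, its state names, and its transition probabilities are invented for illustration and do not come from the text above; the dictionary simply encodes each labeled outgoing edge of the directed graph.

    import random

    # Hypothetical two-state weather chain (illustrative values only).
    # transitions[state][next_state] is the probability labeling the
    # directed edge state -> next_state; each row sums to 1.
    transitions = {
        "sunny": {"sunny": 0.9, "rainy": 0.1},
        "rainy": {"sunny": 0.5, "rainy": 0.5},
    }

    def step(state):
        """Sample the next state using only the current state."""
        successors = list(transitions[state])
        weights = [transitions[state][s] for s in successors]
        return random.choices(successors, weights=weights)[0]

    # Simulate a short trajectory starting from "sunny".
    state = "sunny"
    trajectory = [state]
    for _ in range(5):
        state = step(state)
        trajectory.append(state)
    print(trajectory)

Note that step consults only the current state and never the earlier history, which is exactly the Markov property stated formally above.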
