Markov Chain
In this series of posts, we will try to understand an interesting model that can be applied in various ways for numerous applications. The Markov chain is an important model, and one should have a good understanding of it, so we will go a bit deeper and see what it actually is.
A Markov chain is a stochastic model.
Stochastic means having a random probability distribution whose statistical properties can be analysed but whose outcome cannot be predicted precisely.
In a Markov chain we can only predict the probability of the next event from knowledge of the current event, i.e. the current state gives the probability of the next state.
A stochastic process is a Markov process if it satisfies the Markov property.
Markov Property: in a stochastic process, the conditional probability distribution of future states depends only upon the present state and not on the sequence of events that preceded it.
In simple words, Markov processes are those processes in which the present state decides the next state without any influence from earlier states. Picture a simple Markov chain as a diagram of states connected by links, where the value on each link indicates the probability of transitioning to the next state.
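To make this concrete, here is a minimal sketch of a two-state chain in Python. The weather states and probabilities are made up for illustration; the point is that the next state is sampled using nothing but the current state, which is exactly the Markov property.

```python
import random

# Hypothetical two-state chain: the states and probabilities below are
# illustrative, not taken from any real data.
transitions = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def next_state(current):
    # The next state depends only on the current state (Markov property).
    states = list(transitions[current])
    weights = list(transitions[current].values())
    return random.choices(states, weights=weights)[0]

state = "Sunny"
for _ in range(5):
    state = next_state(state)
    print(state)
```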
Now that we know what a Markov process is, let us define a Markov chain:
A Markov chain is a type of Markov process with a finite or countable state space.
The changes of state in a Markov chain are called transitions, and the probabilities associated with those transitions are known as transition probabilities. We can express a Markov chain with a transition matrix that describes the probability of each transition.
Transition Matrix (Stochastic Matrix):
A stochastic matrix, or transition matrix, is a matrix used to define the transitions in a Markov chain. Each entry is a non-negative real number representing the probability of a change from state a to state x, where a is the current state, given as a row, and x is a next state reachable from a. This type of stochastic matrix is known as a right stochastic matrix, in which the probabilities in each row sum up to 1. Similarly, one can define the transitions the other way around, where each column denotes a current state and the values in that column give the transition probabilities for jumps out of it; this type is called a left stochastic matrix, in which the probabilities in each column sum up to 1. These matrices go by various names, such as probability matrix, transition matrix, stochastic matrix, Markov matrix or substitution matrix.
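Here is a small sketch of a right stochastic matrix in Python (the probabilities are illustrative). It checks that each row sums to 1 and shows how a distribution over states is pushed forward one step by multiplying with the matrix.

```python
import numpy as np

# Right stochastic matrix for a 3-state chain: row i holds the
# probabilities of moving from state i to each state. Values are
# illustrative.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# Right stochastic: every row sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# With a row vector v of current-state probabilities, the next-step
# distribution is v @ P.
v = np.array([1.0, 0.0, 0.0])  # start in state 0 with certainty
print(v @ P)      # distribution after one transition
print(v @ P @ P)  # distribution after two transitions
```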
Applications of Markov Chains:
1. Gambler's Ruin:
A persistent gambler who raises his bet by a fixed fraction of his wealth when he wins but does not reduce it after losing will eventually lose all of his wealth. This is known as gambler's ruin, and it can be modeled with a Markov chain, as the simulation sketch below shows.
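The sketch below simulates the classic unit-bet version of gambler's ruin (a simplification of the fractional-raise strategy described above): the gambler's wealth is the state, it moves up or down by 1 each round, and 0 and the goal are absorbing states. The starting wealth, goal and win probability are illustrative.

```python
import random

def gamblers_ruin(start=5, goal=10, p=0.5):
    # State = current wealth; 0 (ruin) and `goal` are absorbing states.
    wealth = start
    while 0 < wealth < goal:
        wealth += 1 if random.random() < p else -1
    return wealth

# Estimate the probability of ruin by simulation.
trials = 10_000
ruined = sum(gamblers_ruin() == 0 for _ in range(trials))
print(f"estimated probability of ruin: {ruined / trials:.3f}")
```

For a fair game starting at half the goal, the estimate should come out near 0.5.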
2. Birth-Death Process:
A birth-death process is a process in which the state transitions are of only two types: birth and death. When the transition is a birth, the state is incremented, and when the transition is a death, the state is decremented. This process can be modeled with Markov chains (a sketch follows below) and has various applications, such as performance engineering and demography.
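Here is a sketch of the transition matrix of a simple birth-death chain on the states {0, 1, ..., N}. The constant birth and death probabilities are an assumption made for illustration; real models often let them depend on the current state.

```python
import numpy as np

def birth_death_matrix(N=4, b=0.3, d=0.2):
    # From state i: birth moves to i+1, death moves to i-1, and the
    # leftover probability mass stays at i. b and d are illustrative.
    P = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i < N:
            P[i, i + 1] = b          # birth: state incremented
        if i > 0:
            P[i, i - 1] = d          # death: state decremented
        P[i, i] = 1.0 - P[i].sum()   # stay put with remaining probability
    return P

P = birth_death_matrix()
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a distribution
print(P)
```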
Properties Related To Markov Chains:
Reducibility:
Reducible Markov chains describe a system that has particular states such that, once one of those states is reached, we cannot visit other states. One good example is gambler's ruin: once a gambler goes broke, he won't be able to play the game any more.
Similarly, irreducible Markov chains are those chains in which, starting at any state, we can reach any other state directly or indirectly.
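One way to test irreducibility, sketched below, is to treat the matrix as a directed graph (an edge wherever a transition probability is positive) and check that every state can reach every other state. The matrices are illustrative.

```python
import numpy as np

def is_irreducible(P):
    # Transitive closure of the transition graph, Floyd-Warshall style:
    # reach[i, j] becomes True if state j is reachable from state i.
    reach = np.array(P) > 0
    n = len(reach)
    for k in range(n):
        reach = reach | (reach[:, [k]] & reach[[k], :])
    return bool(reach.all())

P_irreducible = [[0.0, 1.0], [0.5, 0.5]]
P_reducible = [[1.0, 0.0], [0.5, 0.5]]  # state 0 is absorbing
print(is_irreducible(P_irreducible))  # True
print(is_irreducible(P_reducible))    # False
```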
Periodicity:
If there exists a state in a Markov chain such that every return from that state to itself takes a number of hops that is a multiple of k (with k the largest such number), then that state is said to be periodic with period k.
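Equivalently, the period of a state is the gcd of all step counts n for which the chain can return to that state in n steps. The sketch below estimates it by scanning powers of the transition matrix up to a fixed bound, which is a heuristic suitable for small illustrative chains rather than a general algorithm.

```python
import math
from functools import reduce

import numpy as np

def period(P, state, max_steps=50):
    # Collect every n <= max_steps with P^n[state, state] > 0, then
    # take the gcd of those return times.
    P = np.array(P)
    Pn = np.eye(len(P))
    return_times = []
    for n in range(1, max_steps + 1):
        Pn = Pn @ P  # Pn now holds P^n
        if Pn[state, state] > 0:
            return_times.append(n)
    return reduce(math.gcd, return_times) if return_times else 0

# A deterministic 2-cycle: every return to state 0 takes an even
# number of steps, so state 0 has period 2.
P = [[0.0, 1.0], [1.0, 0.0]]
print(period(P, 0))  # 2
```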
Transient:
Given a state k, that state is said to be a transient state if there is a non-zero probability that, after leaving it, we will never return to it; i.e. transient states are those states which, once exited, may never be visited again.
Recurrent:
Given a state k, that state is said to be a recurrent state if there is zero probability that we will never return to it; i.e. after exiting the state, we can say with probability 1 that we will at some point reach the same state again.
Ergodicity:
A state is ergodic if it has period 1 (i.e. it is aperiodic) and is also positive recurrent.
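When every state is ergodic, repeatedly applying the transition matrix to any starting distribution converges to a unique stationary distribution pi with pi = pi P. The power-iteration sketch below demonstrates this on an illustrative matrix.

```python
import numpy as np

# Illustrative ergodic chain: irreducible and aperiodic.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

v = np.array([1.0, 0.0, 0.0])  # start in state 0 with certainty
for _ in range(1000):
    v = v @ P

print(v)                      # approximate stationary distribution
print(np.allclose(v, v @ P))  # True: v is (numerically) fixed by P
```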
Written By,
Sarvesh Bhatnagar