# What is a Markov model?


• A way to exploit the sequential structure of data (e.g. correlations between observations that are close together in the sequence).
• For example, consider predicting whether a given day is rainy. If we treat the data as i.i.d., the only information we glean from it is the overall frequency of rainy days, and we ignore weather trends that persist for several days. In practice, knowing whether or not it rains today helps to predict tomorrow’s weather.
• A Markov model is one of the simplest ways to relax the i.i.d. assumption and capture such sequential effects in a probabilistic model.
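The rainy-day example above can be sketched as a two-state chain in which tomorrow's weather depends only on today's. The transition probabilities below are invented for illustration, not estimated from data:

```python
import random

# Hypothetical transition probabilities: rain tends to persist, and so does dry
# weather. Each row is the distribution over tomorrow's state given today's.
P = {
    "rain": {"rain": 0.7, "dry": 0.3},
    "dry":  {"rain": 0.2, "dry": 0.8},
}

def sample_sequence(start, n, rng=None):
    """Sample n days of weather, each day depending only on the previous one."""
    rng = rng or random.Random(0)
    seq = [start]
    for _ in range(n - 1):
        probs = P[seq[-1]]
        states, weights = zip(*probs.items())
        seq.append(rng.choices(states, weights=weights)[0])
    return seq

print(sample_sequence("rain", 10))
```

Under an i.i.d. model, by contrast, every day would be drawn from the same marginal distribution regardless of yesterday's state.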

## Joint distribution for a Markov chain

• General expression of the joint distribution for a sequence of observations, obtained from the product rule: $p(X_1, ..., X_N) = p(X_1)\displaystyle\prod_{n=2}^{N} p(X_n|X_1, ..., X_{n-1})$

• First-order Markov chain

If we assume that each of the conditional distributions on the right-hand side depends only on the most recent observation and is independent of all earlier ones, we obtain the first-order Markov chain: $p(X_1, ..., X_N) = p(X_1)\displaystyle\prod_{n=2}^{N} p(X_n|X_{n-1})$

In other words, the conditional distribution for observation $X_n$ is given by $p(X_n|X_1, ..., X_{n-1}) = p(X_n|X_{n-1})$
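The first-order factorization can be evaluated directly: the joint probability of a sequence is the initial probability times a product of one-step transition probabilities. The initial distribution and transition table here are illustrative, not fitted to data:

```python
# Hypothetical initial distribution and transition probabilities for the
# two-state weather example.
init = {"rain": 0.5, "dry": 0.5}
trans = {
    "rain": {"rain": 0.7, "dry": 0.3},
    "dry":  {"rain": 0.2, "dry": 0.8},
}

def joint_prob(seq):
    """p(x_1, ..., x_N) = p(x_1) * prod_n p(x_n | x_{n-1})."""
    p = init[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= trans[prev][cur]
    return p

print(joint_prob(["rain", "rain", "dry"]))  # 0.5 * 0.7 * 0.3
```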

• Second-order Markov chain

Likewise, in a second-order Markov chain the conditional distribution is $p(X_n|X_1, ..., X_{n-1}) = p(X_n|X_{n-1}, X_{n-2})$

which means each observation is influenced only by the two previous observations and is independent of all earlier ones.
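A second-order chain conditions on the pair of the two previous states, so the transition table is indexed by state pairs. A minimal sketch, again with made-up probabilities:

```python
import random

# Hypothetical second-order transitions: the distribution of the next state
# depends on the pair (x_{n-2}, x_{n-1}).
trans2 = {
    ("rain", "rain"): {"rain": 0.8, "dry": 0.2},
    ("rain", "dry"):  {"rain": 0.3, "dry": 0.7},
    ("dry", "rain"):  {"rain": 0.5, "dry": 0.5},
    ("dry", "dry"):   {"rain": 0.1, "dry": 0.9},
}

def sample_second_order(x1, x2, n, rng=None):
    """Sample n states; the first two are given, each later state depends on two predecessors."""
    rng = rng or random.Random(0)
    seq = [x1, x2]
    while len(seq) < n:
        probs = trans2[(seq[-2], seq[-1])]
        states, weights = zip(*probs.items())
        seq.append(rng.choices(states, weights=weights)[0])
    return seq

print(sample_second_order("dry", "rain", 8))
```

Note the cost of higher order: with $K$ states, an $m$-th order chain needs a transition table over $K^m$ conditioning contexts, which grows quickly.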