
Markov Process — Definition, Formula & Examples

A Markov process is a random process where the probability of what happens next depends only on the current state, not on the sequence of states that came before it. This "memoryless" property is called the Markov property.

A stochastic process $\{X_t : t \in T\}$ satisfies the Markov property if, for all times $t_1 < t_2 < \cdots < t_n < t_{n+1}$, the conditional distribution of $X_{t_{n+1}}$ given $X_{t_1}, X_{t_2}, \ldots, X_{t_n}$ equals the conditional distribution of $X_{t_{n+1}}$ given $X_{t_n}$ alone: $P(X_{t_{n+1}} \leq x \mid X_{t_1}, \ldots, X_{t_n}) = P(X_{t_{n+1}} \leq x \mid X_{t_n})$.

Key Formula

$$P(X_{n+1} = j \mid X_n = i) = p_{ij}$$
Where:
  • $X_n$ = State of the process at time step $n$
  • $p_{ij}$ = Transition probability from state $i$ to state $j$
  • $i, j$ = States in the state space

How It Works

A Markov process is defined by its set of states and transition probabilities—the chances of moving from one state to another. When the state space is discrete and time advances in steps, the process is called a Markov chain, and transition probabilities are organized into a matrix. To predict the distribution of future states, you multiply the current state vector by the transition matrix repeatedly. The key simplification is that you never need to track history beyond the present state.
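The repeated multiplication described above can be sketched in a few lines of Python. The two-state transition matrix here is an illustrative assumption, not tied to any particular application:

```python
# Transition matrix: row i holds the probabilities of moving from
# state i to each state j (each row must sum to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(pi, P):
    """Advance a state distribution one time step: pi' = pi * P."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi = [1.0, 0.0]     # start in state 0 with certainty
for _ in range(3):  # distribution after three steps
    pi = step(pi, P)
print(pi)
```

Note that only the current distribution `pi` is carried forward; no history of earlier states is needed, which is the Markov property at work.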

Worked Example

Problem: The weather in a town is modeled as a Markov process with two states: Sunny (S) and Rainy (R). If today is sunny, the probability of sun tomorrow is 0.8 and rain is 0.2. If today is rainy, the probability of sun tomorrow is 0.4 and rain is 0.6. Today is sunny. What is the probability it will be rainy two days from now?
Write the transition matrix: Rows represent today's state; columns represent tomorrow's state.
$$P = \begin{pmatrix} 0.8 & 0.2 \\ 0.4 & 0.6 \end{pmatrix}$$
Find the state distribution after day 1: Start with the vector $\pi_0 = (1, 0)$ since today is sunny. Multiply by $P$.
$$\pi_1 = \pi_0 P = (1 \cdot 0.8 + 0 \cdot 0.4,\; 1 \cdot 0.2 + 0 \cdot 0.6) = (0.8,\; 0.2)$$
Find the state distribution after day 2: Multiply $\pi_1$ by $P$ again.
$$\pi_2 = \pi_1 P = (0.8 \cdot 0.8 + 0.2 \cdot 0.4,\; 0.8 \cdot 0.2 + 0.2 \cdot 0.6) = (0.72,\; 0.28)$$
Answer: The probability it will be rainy two days from now is 0.28 (28%).
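The steps above can be checked numerically with a minimal sketch, encoding Sunny as index 0 and Rainy as index 1:

```python
# Verifying the worked example: Sunny = index 0, Rainy = index 1.
P = [[0.8, 0.2],
     [0.4, 0.6]]

def step(pi, P):
    # One step of the chain: pi' = pi * P (row vector times matrix).
    return [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

pi = [1.0, 0.0]   # today is sunny
pi = step(pi, P)  # day 1: (0.8, 0.2)
pi = step(pi, P)  # day 2: (0.72, 0.28)
print(pi[1])      # probability of rain two days from now
```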

Why It Matters

Markov processes underpin Google's PageRank algorithm, speech recognition systems, and financial models of stock prices. In a probability or stochastic processes course, they are the gateway to topics like stationary distributions, ergodic theory, and Monte Carlo simulation.

Common Mistakes

Mistake: Assuming the Markov property means each state transition is independent of everything else.
Correction: Transitions do depend on the current state—they are conditionally independent of the past given the present state, not unconditionally independent.
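A short simulation makes the distinction concrete (a sketch reusing the weather chain's probabilities as an assumed example): rain tomorrow depends strongly on today's state, but once today's state is fixed, yesterday's state adds no further information.

```python
import random

random.seed(0)
# Assumed transition probabilities: row = today's state (0 = sunny, 1 = rainy).
P = [[0.8, 0.2],
     [0.4, 0.6]]

def next_state(s):
    return 0 if random.random() < P[s][0] else 1

# Simulate a long trajectory of the chain.
states = [0]
for _ in range(200_000):
    states.append(next_state(states[-1]))

# Transitions DO depend on the current state: rain tomorrow is far more
# likely after a rainy day (about 0.6) than overall (about 1/3).
after_rain = [states[t + 1] for t in range(len(states) - 1) if states[t] == 1]
print(sum(after_rain) / len(after_rain))  # near 0.6
print(sum(states) / len(states))          # near 1/3

# But given today's state, yesterday's adds nothing: conditioning further
# on X_{t-1} leaves the estimate essentially unchanged.
rates = {}
for prev in (0, 1):
    sel = [states[t + 1] for t in range(1, len(states) - 1)
           if states[t] == 1 and states[t - 1] == prev]
    rates[prev] = sum(sel) / len(sel)
print(rates)  # both entries near 0.6
```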