Markov Process — Definition, Formula & Examples
A Markov process is a random process where the probability of what happens next depends only on the current state, not on the sequence of states that came before it. This "memoryless" property is called the Markov property.
A stochastic process $\{X_n\}$ satisfies the Markov property if, for all times $n$, the conditional distribution of $X_{n+1}$ given $X_0, X_1, \ldots, X_n$ equals the conditional distribution of $X_{n+1}$ given $X_n$ alone: $P(X_{n+1} = j \mid X_0 = i_0, \ldots, X_{n-1} = i_{n-1}, X_n = i) = P(X_{n+1} = j \mid X_n = i)$.
Key Formula
$$p_{ij} = P(X_{n+1} = j \mid X_n = i)$$
Where:
- $X_n$ = State of the process at time step $n$
- $p_{ij}$ = Transition probability from state $i$ to state $j$
- $i, j$ = States in the state space
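In code, the transition probabilities $p_{ij}$ are often stored as a lookup from the current state to a distribution over next states. A minimal sketch with hypothetical states "A" and "B" (the numbers are illustrative, not from this article's example):

```python
# Hypothetical transition probabilities p_ij, keyed by current state,
# then next state.
p = {
    "A": {"A": 0.6, "B": 0.4},
    "B": {"A": 0.3, "B": 0.7},
}

# From every state, the probabilities of going somewhere must sum to 1.
for state, row in p.items():
    assert abs(sum(row.values()) - 1.0) < 1e-12, f"row {state} is not a distribution"

print(p["A"]["B"])  # p_AB: probability of moving from state A to state B
```

The row-sum check is worth keeping in real code: a matrix whose rows do not sum to 1 is not a valid set of transition probabilities.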
How It Works
A Markov process is defined by its set of states and transition probabilities—the chances of moving from one state to another. When the state space is discrete and time advances in steps, the process is called a Markov chain, and transition probabilities are organized into a matrix. To predict the distribution of future states, you multiply the current state vector by the transition matrix repeatedly. The key simplification is that you never need to track history beyond the present state.
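The repeated-multiplication idea can be sketched in a few lines of NumPy. The two-state matrix here is hypothetical, chosen only to show the mechanics:

```python
import numpy as np

# Hypothetical two-state Markov chain. Each row of P holds the
# transition probabilities out of one state and must sum to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

dist = np.array([1.0, 0.0])  # start with certainty in state 0

# Predicting n steps ahead = multiplying the distribution by P n times;
# no history beyond the current distribution is needed.
for n in range(1, 4):
    dist = dist @ P
    print(n, dist)

# Equivalently, the n-step transition probabilities are entries of P^n.
P3 = np.linalg.matrix_power(P, 3)
```

Note that `dist` after three steps matches row 0 of `P3`, since the process started with all probability in state 0.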
Worked Example
Problem: The weather in a town is modeled as a Markov process with two states: Sunny (S) and Rainy (R). If today is sunny, the probability of sun tomorrow is 0.8 and rain is 0.2. If today is rainy, the probability of sun tomorrow is 0.4 and rain is 0.6. Today is sunny. What is the probability it will be rainy two days from now?
Write the transition matrix: Rows represent today's state; columns represent tomorrow's state (order: Sunny, Rainy).
$$P = \begin{pmatrix} 0.8 & 0.2 \\ 0.4 & 0.6 \end{pmatrix}$$
Find the state distribution after day 1: Start with the vector $\pi_0 = [1, 0]$ (all probability on Sunny) since today is sunny. Multiply by $P$: $\pi_1 = \pi_0 P = [0.8, 0.2]$.
Find the state distribution after day 2: Multiply by $P$ again: $\pi_2 = \pi_1 P = [0.8(0.8) + 0.2(0.4),\ 0.8(0.2) + 0.2(0.6)] = [0.72, 0.28]$.
Answer: The probability it will be rainy two days from now is 0.28 (28%).
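The arithmetic above can be checked in plain Python, with no libraries:

```python
# Two-state weather chain from the worked example: rows are today's
# state (Sunny, Rainy), columns are tomorrow's state.
P = [[0.8, 0.2],
     [0.4, 0.6]]

def step(dist):
    """Advance the distribution one day: new[j] = sum_i dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

pi0 = [1.0, 0.0]   # today is sunny
pi1 = step(pi0)    # ≈ [0.8, 0.2]
pi2 = step(pi1)    # ≈ [0.72, 0.28]

print(round(pi2[1], 2))  # probability of rain two days out
```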
Why It Matters
Markov processes underpin Google's PageRank algorithm, speech recognition systems, and financial models of stock prices. In a probability or stochastic processes course, they are the gateway to topics like stationary distributions, ergodic theory, and Monte Carlo simulation.
Common Mistakes
Mistake: Assuming the Markov property means each state transition is independent of everything else.
Correction: Transitions do depend on the current state—they are conditionally independent of the past given the present state, not unconditionally independent.
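One way to see this correction concretely is to simulate the weather chain from the worked example and check that the chance of rain after a sunny day does not depend on what happened the day before. A sketch (the simulation length and seed are arbitrary choices):

```python
import random

# Weather chain from the worked example.
P = {"S": {"S": 0.8, "R": 0.2}, "R": {"S": 0.4, "R": 0.6}}

def next_state(state, rng):
    return "S" if rng.random() < P[state]["S"] else "R"

def simulate(n, rng):
    states = ["S"]
    for _ in range(n):
        states.append(next_state(states[-1], rng))
    return states

rng = random.Random(0)
path = simulate(200_000, rng)

# Frequency of rain tomorrow given today is sunny, split by yesterday's
# state: counts[prev] = [sunny-today occurrences, rain-next occurrences].
counts = {"S": [0, 0], "R": [0, 0]}
for prev, cur, nxt in zip(path, path[1:], path[2:]):
    if cur == "S":
        counts[prev][0] += 1
        counts[prev][1] += nxt == "R"

for prev, (n, r) in counts.items():
    print(prev, round(r / n, 3))
```

Both printed frequencies come out close to 0.2: given a sunny day, yesterday's weather carries no extra information, even though tomorrow clearly depends on today.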
