Mathwords

Dynamical System — Definition, Formula & Examples

A dynamical system is a mathematical model that describes how the state of a system evolves over time according to a fixed rule. The rule can operate in discrete time steps (like a recurrence relation) or in continuous time (like a differential equation).

A dynamical system is a tuple $(T, X, \Phi)$ where $T$ is a time set (typically $\mathbb{Z}$ or $\mathbb{R}$), $X$ is a state space, and $\Phi : T \times X \to X$ is an evolution operator satisfying $\Phi(0, x) = x$ and $\Phi(t+s, x) = \Phi(t, \Phi(s, x))$ for all $t, s \in T$ and $x \in X$.
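Both axioms are easy to check on a concrete flow. A minimal sketch in Python, using the linear flow $\Phi(t, x) = x e^{at}$ (the solution of $\dot{x} = ax$) as an illustrative example of our own choosing:

```python
import math

def phi(t, x, a=-1.0):
    """Flow of the linear ODE dx/dt = a*x: Phi(t, x) = x * exp(a*t)."""
    return x * math.exp(a * t)

# Identity axiom: Phi(0, x) = x
assert phi(0.0, 3.5) == 3.5

# Semigroup axiom: Phi(t+s, x) = Phi(t, Phi(s, x))
t, s, x = 0.7, 1.2, 3.5
assert math.isclose(phi(t + s, x), phi(t, phi(s, x)))
```

Evolving for time $t+s$ in one go agrees with evolving for time $s$ and then time $t$, which is exactly the semigroup property in the definition.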

Key Formula

$$x_{n+1} = f(x_n) \quad \text{(discrete)} \qquad \frac{dx}{dt} = f(x) \quad \text{(continuous)}$$
Where:
  • $x_n$ = State of the system at time step $n$
  • $f$ = Evolution rule (map or vector field)
  • $t$ = Continuous time parameter
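The two forms of the rule can be sketched side by side in Python. This is an illustrative sketch, not part of the article: `iterate` applies a discrete map, and `euler` approximates the continuous flow with explicit Euler steps (a crude numerical stand-in for the true solution):

```python
def iterate(f, x0, n):
    """Discrete system: apply x_{k+1} = f(x_k) n times, returning the orbit."""
    orbit = [x0]
    for _ in range(n):
        orbit.append(f(orbit[-1]))
    return orbit

def euler(f, x0, t_end, dt=1e-3):
    """Continuous system dx/dt = f(x): approximate the state at time t_end
    by taking many small explicit Euler steps x <- x + dt * f(x)."""
    x, t = x0, 0.0
    while t < t_end:
        x += dt * f(x)
        t += dt
    return x

# Halving map: orbit is [1.0, 0.5, 0.25, 0.125]
print(iterate(lambda x: 0.5 * x, 1.0, 3))

# dx/dt = -x with x(0) = 1: the state at t = 1 is close to e^{-1} ~ 0.3679
print(euler(lambda x: -x, 1.0, 1.0))
```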

How It Works

You start with an initial state $x_0$ and apply the evolution rule repeatedly (discrete case) or continuously. In a discrete dynamical system, the rule is an iterated map $x_{n+1} = f(x_n)$. In a continuous dynamical system, the rule is typically a system of ODEs $\dot{x} = f(x)$, and the solution flow gives the state at any future time. The central questions are: does the system settle to a fixed point, oscillate periodically, or behave chaotically?
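One way to probe the first of those questions numerically is to iterate until successive states stop changing. A sketch, using $f(x) = \cos x$ as our own example of a map that settles to a fixed point:

```python
import math

def settle(f, x0, tol=1e-10, max_steps=10_000):
    """Iterate x_{n+1} = f(x_n) until successive states agree to within tol.
    Returns (state, steps_taken); a plateau suggests (but does not prove)
    convergence to a stable fixed point."""
    x = x0
    for n in range(max_steps):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next, n + 1
        x = x_next
    return x, max_steps

# cos(x) = x has a unique solution near 0.739085 (the Dottie number),
# and iteration from x0 = 1.0 settles there.
x_star, steps = settle(math.cos, 1.0)
```

A periodic or chaotic orbit would instead exhaust `max_steps` without the plateau test ever firing, which is why the return value includes the step count.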

Worked Example

Problem: Consider the discrete dynamical system defined by $x_{n+1} = 2x_n(1 - x_n)$ with initial state $x_0 = 0.25$. Find $x_1$, $x_2$, and $x_3$, and determine whether the system approaches a fixed point.
Step 1: Compute $x_1$ by substituting $x_0 = 0.25$ into the map.
$$x_1 = 2(0.25)(1 - 0.25) = 2(0.25)(0.75) = 0.375$$
Step 2: Compute $x_2$ using $x_1 = 0.375$.
$$x_2 = 2(0.375)(0.625) = 0.46875$$
Step 3: Compute $x_3$ using $x_2 = 0.46875$.
$$x_3 = 2(0.46875)(0.53125) = 0.498046875$$
Step 4: Find the fixed points by solving $x^* = 2x^*(1 - x^*)$. This gives $x^* = 0$ or $x^* = 0.5$. The iterates are converging toward $x^* = 0.5$.
$$x^* = 2x^*(1 - x^*) \implies x^* = 0 \;\text{ or }\; x^* = \tfrac{1}{2}$$
Answer: $x_1 = 0.375$, $x_2 = 0.46875$, $x_3 \approx 0.498$. The system converges to the fixed point $x^* = 0.5$, which is stable because $|f'(x^*)| = |2 - 4(0.5)| = 0 < 1$.
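The steps above can be reproduced in a few lines of Python (the map and all values are taken directly from the worked example):

```python
def logistic2(x):
    """The example map x_{n+1} = 2x(1 - x)."""
    return 2.0 * x * (1.0 - x)

# Iterate three times from x0 = 0.25, recording the orbit.
x = 0.25
orbit = [x]
for _ in range(3):
    x = logistic2(x)
    orbit.append(x)

print(orbit)  # [0.25, 0.375, 0.46875, 0.498046875]

# x* = 0.5 is a fixed point: the map sends it to itself.
print(logistic2(0.5))  # 0.5
```

Every value in the orbit is a dyadic rational, so the floating-point results here match the hand computation exactly.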

Why It Matters

Dynamical systems are foundational in courses on differential equations, chaos theory, and mathematical modeling. They model real phenomena ranging from population dynamics in ecology to orbital mechanics in aerospace engineering. Understanding fixed points, stability, and bifurcations is essential for analyzing any system that evolves over time.

Common Mistakes

Mistake: Confusing a fixed point with a stable fixed point. Students find where $f(x^*) = x^*$ and assume the system always converges there.
Correction: A fixed point can be unstable (repelling). Check stability by computing $|f'(x^*)|$: the fixed point is stable only if $|f'(x^*)| < 1$ in the discrete case.
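Applying this criterion to the worked example's map $f(x) = 2x(1 - x)$, whose derivative is $f'(x) = 2 - 4x$, shows that only one of its two fixed points is stable:

```python
def fprime(x):
    """Derivative of the example map f(x) = 2x(1 - x): f'(x) = 2 - 4x."""
    return 2.0 - 4.0 * x

# x* = 0.5: |f'(0.5)| = 0 < 1  -> stable (attracting)
assert abs(fprime(0.5)) < 1

# x* = 0:   |f'(0)|   = 2 > 1  -> unstable (repelling)
assert abs(fprime(0.0)) > 1
```

This is why the iterates in the worked example flow toward $0.5$ rather than toward the other fixed point at $0$.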